Apr 17 16:31:10.220112 ip-10-0-136-182 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:10.688289 ip-10-0-136-182 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:10.688289 ip-10-0-136-182 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:10.688289 ip-10-0-136-182 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:10.688289 ip-10-0-136-182 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:10.688289 ip-10-0-136-182 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:10.690528 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.690439    2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:31:10.694940 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694916    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:10.694940 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694935    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:10.694940 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694939    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:10.694940 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694943    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:10.694940 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694946    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:10.694940 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694949    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694952    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694955    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694960    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694962    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694965    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694968    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694971    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694974    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694976    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694980    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694983    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694985    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694988    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694990    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694994    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694996    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.694999    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695002    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695004    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:10.695172 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695007    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695017    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695020    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695024    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695027    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695030    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695033    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695035    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695038    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695041    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695044    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695048    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695052    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695055    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695059    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695061    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695064    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695067    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695070    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695072    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:10.695645 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695075    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695078    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695081    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695083    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695086    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695089    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695091    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695094    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695096    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695099    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695101    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695104    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695108    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695112    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695115    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695118    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695122    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695125    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695127    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:10.696153 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695131    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695133    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695136    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695139    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695141    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695144    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695146    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695150    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695153    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695156    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695159    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695161    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695163    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695166    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695170    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695173    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695176    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695178    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695181    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:10.696611 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695184    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695187    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695189    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695603    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695608    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695611    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695614    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695617    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695619    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695622    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695625    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695627    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695630    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695633    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695636    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695638    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695641    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695644    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695647    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695649    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:10.697083 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695653    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695656    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695659    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695662    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695664    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695667    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695669    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695673    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695676    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695678    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695681    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695683    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695686    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695688    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695691    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695693    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695695    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695698    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:10.697558 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695700    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695703    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695705    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695708    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695711    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695713    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695716    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695719    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695721    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695725    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695729    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695732    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695735    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695738    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695741    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695744    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695746    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695749    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695751    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695754    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695756    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:10.698050 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695759    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695763    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695766    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695768    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695771    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695773    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695776    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695778    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695781    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695783    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695787    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695791    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695793    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695796    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695814    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695818    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695820    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695823    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695826    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:10.698540 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695828    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695831    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695833    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695836    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695839    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695842    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695845    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695848    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695850    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695853    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.695855    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697667    2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697678    2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697686    2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697691    2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697696    2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697699    2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697704    2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697708    2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697712    2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697715    2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:31:10.699013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697718    2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697722    2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697725    2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697728    2569 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697731    2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697734    2569 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697737    2569 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697741    2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697744    2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697748    2569 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697751    2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697754    2569 flags.go:64] FLAG: --config-dir=""
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697756    2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697760    2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697763    2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697766    2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697769    2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697773    2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697776    2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697779    2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697782    2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697785    2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697788    2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697792    2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697795    2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:31:10.699516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697812    2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697815    2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697820    2569 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697823    2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697829    2569 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697832    2569 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697835    2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697838    2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697841    2569 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697845    2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697848    2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697851    2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697855    2569 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697858    2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697861    2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697864    2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697867    2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697870    2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697873    2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697876    2569 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697885    2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697888    2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697891    2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697895    2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697898    2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 16:31:10.700232 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697901    2569 flags.go:64] FLAG: --help="false"
Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697904    2569 flags.go:64] FLAG: --hostname-override="ip-10-0-136-182.ec2.internal"
Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697907    2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697910    2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 16:31:10.700854 ip-10-0-136-182
kubenswrapper[2569]: I0417 16:31:10.697912 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697916 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697920 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697923 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697926 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697929 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697935 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697938 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697941 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697944 2569 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697947 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697950 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697957 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697960 2569 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697963 2569 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697966 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697969 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697972 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697977 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:10.700854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697980 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697984 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697987 2569 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697990 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697993 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697996 2569 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.697999 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698003 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698006 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:10.701446 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:31:10.698011 2569 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698014 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698016 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698019 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698022 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698026 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698029 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698032 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698040 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698043 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698046 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698049 2569 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698052 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698058 2569 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698061 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:10.701446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698065 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698068 2569 flags.go:64] FLAG: --port="10250" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698071 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698074 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01c354cb3ef6e19d9" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698078 2569 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698080 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698083 2569 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698086 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698090 2569 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698093 2569 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698096 2569 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698099 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698102 2569 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698106 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 
16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698109 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698112 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698114 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698117 2569 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698120 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698123 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698126 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698128 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698131 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698134 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698137 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698140 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:10.702035 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698143 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698145 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: 
I0417 16:31:10.698148 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698152 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698155 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698157 2569 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698160 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698166 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698169 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698172 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698176 2569 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698179 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698183 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698186 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698189 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698192 2569 flags.go:64] FLAG: --v="2" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698197 2569 flags.go:64] FLAG: --version="false" Apr 17 16:31:10.702700 
ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698201 2569 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698206 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.698209 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698315 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698319 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698322 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698325 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:10.702700 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698327 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698330 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698332 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698335 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698338 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698341 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting 
Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698343 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698346 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698348 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698350 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698353 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698356 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698359 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698361 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698365 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698368 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698371 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698374 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698376 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:10.703293 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698379 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698383 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698386 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698388 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698391 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698393 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698396 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698398 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:10.703764 ip-10-0-136-182 
kubenswrapper[2569]: W0417 16:31:10.698401 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698404 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698406 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698409 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698412 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698414 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698417 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698419 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698422 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698424 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698427 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698429 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:10.703764 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698432 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:10.704291 ip-10-0-136-182 
kubenswrapper[2569]: W0417 16:31:10.698434 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698437 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698439 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698442 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698444 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698447 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698450 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698452 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698455 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698457 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698460 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698462 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698466 2569 feature_gate.go:328] unrecognized 
feature gate: NutanixMultiSubnets Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698469 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698472 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698474 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698476 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698479 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698481 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:10.704291 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698484 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698486 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698489 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698491 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698493 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698496 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698498 2569 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698501 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698503 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698506 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698508 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698511 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698513 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698516 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698518 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698520 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698523 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698525 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698529 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 17 16:31:10.704786 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698533 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:10.705267 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698536 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:10.705267 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698540 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:10.705267 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.698543 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:10.705267 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.699192 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:10.705590 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.705570 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:31:10.705625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.705591 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:31:10.705653 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705641 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:10.705653 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705647 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:10.705653 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705650 2569 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:10.705653 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705653 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705657 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705660 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705663 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705666 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705668 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705671 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705674 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705676 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705679 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705681 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705684 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705688 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705690 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705693 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705695 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705699 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705701 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705704 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705707 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:10.705754 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705709 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705712 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705715 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705717 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705720 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705722 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705725 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705727 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705730 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705733 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705736 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705739 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705741 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705744 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705747 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705750 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705752 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705756 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:10.706273 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705759 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705761 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705764 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705767 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705769 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705772 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705774 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705777 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705779 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705782 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705785 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705787 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705790 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705793 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705795 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705811 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705816 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705819 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705821 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705824 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705827 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:10.706768 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705829 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705832 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705835 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705838 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705840 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705843 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705846 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705849 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705852 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705855 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705857 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705860 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705862 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705865 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705867 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705872 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705876 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705880 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705884 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:10.707277 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705887 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705890 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705894 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705897 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.705899 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.705905 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706008 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706013 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706016 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706019 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706021 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706024 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706026 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706029 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706031 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:31:10.707733 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706034 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706038 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706042 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706045 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706049 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706053 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706056 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706058 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706061 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706063 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706066 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706069 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706071 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706074 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706076 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706079 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706081 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706084 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706087 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:31:10.708134 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706089 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706092 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706095 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706097 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706100 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706102 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706105 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706107 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706110 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706113 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706115 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706118 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706120 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706122 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706125 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706128 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706131 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706133 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706137 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706139 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:31:10.708594 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706142 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706145 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706147 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706150 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706152 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706155 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706157 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706160 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706162 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706165 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706167 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706170 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706172 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706175 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706177 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706179 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706182 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706184 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706187 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706189 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:31:10.709097 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706192 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706194 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706196 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706199 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706201 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706203 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706206 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706208 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706211 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706214 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706217 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706219 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706221 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706224 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706227 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706229 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706232 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:31:10.709575 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:10.706234 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:31:10.710008 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.706239 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:31:10.710008 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.706943 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:31:10.710008 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.708939 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:31:10.710008 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.709996 2569 server.go:1019] "Starting client certificate rotation"
Apr 17 16:31:10.710118 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.710092 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:10.710906 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.710894 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:31:10.734835 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.734794 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:10.737345 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.737322 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:31:10.751975 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.751947 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:31:10.758023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.758007 2569 log.go:25] "Validated CRI v1 image API"
Apr 17 16:31:10.759957 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.759942 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:31:10.766986 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.766960 2569 fs.go:135] Filesystem UUIDs: map[62c51453-99f6-4e4e-aa70-9e0de4c59650:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8d738a09-715d-4e1d-8c31-90118324284f:/dev/nvme0n1p4]
Apr 17 16:31:10.767058 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.766988 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:31:10.773333 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.773216 2569 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:10.771089842 +0000 UTC m=+0.427897048 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3185667 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2244e336498f234064370e51ed3fd2 SystemUUID:ec2244e3-3649-8f23-4064-370e51ed3fd2 BootID:480f6fb0-1bdc-48db-ab8e-3105f960fa82 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7e:06:18:f1:b7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7e:06:18:f1:b7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:37:d6:08:44:e5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:31:10.773333 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.773327 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:31:10.773452 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.773423 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:31:10.773682 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.773663 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:31:10.774587 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.774559 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:31:10.774734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.774589 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-182.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:31:10.774782 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.774743 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:31:10.774782 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.774752 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:31:10.774782 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.774764 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:31:10.775745 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.775735 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:31:10.777146 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.777135 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:31:10.777258 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.777249 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:31:10.782165 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.782154 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:31:10.782199 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.782170 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:31:10.782199 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.782186 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:31:10.782199 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.782196 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:31:10.782339 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.782205 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:31:10.783306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.783291 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:31:10.783389 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.783311 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:31:10.786586 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.786568 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:31:10.787966 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.787953 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 16:31:10.789376 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789361 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789386 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789392 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789399 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789409 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17
16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789416 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789422 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789427 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789435 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:10.789449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789442 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:10.789688 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789460 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:31:10.789688 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.789469 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:10.791378 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.791367 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:10.791437 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.791381 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:10.794517 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.794487 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-182.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:31:10.794610 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.794578 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-182.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:31:10.794610 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.794574 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:10.795306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.795293 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:10.795345 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.795328 2569 server.go:1295] "Started kubelet" Apr 17 16:31:10.795458 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.795420 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:10.795516 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.795484 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:10.795558 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.795440 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:10.796312 ip-10-0-136-182 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 16:31:10.796502 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.796480 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:10.797653 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.797639 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:10.802151 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.802123 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:10.802687 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.802661 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:10.803692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.803487 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:10.803692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.803510 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:10.803692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.803654 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:10.803692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.803661 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:10.804834 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.804621 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:10.804957 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.804947 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:10.806595 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.806572 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 16:31:10.806922 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.806898 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 16:31:10.807032 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.806988 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:31:10.807178 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.805967 2569 factory.go:55] Registering systemd factory Apr 17 16:31:10.807352 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.807331 2569 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:10.807614 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.807598 2569 factory.go:153] Registering CRI-O factory Apr 17 16:31:10.807675 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.807618 2569 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:10.807746 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.807732 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:31:10.807827 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.807766 2569 factory.go:103] Registering Raw factory Apr 17 16:31:10.807827 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.807785 2569 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:10.808254 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.808240 2569 manager.go:319] 
Starting recovery of all containers Apr 17 16:31:10.810706 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.806514 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-182.ec2.internal.18a731ecb850f1f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-182.ec2.internal,UID:ip-10-0-136-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-182.ec2.internal,},FirstTimestamp:2026-04-17 16:31:10.795305456 +0000 UTC m=+0.452112659,LastTimestamp:2026-04-17 16:31:10.795305456 +0000 UTC m=+0.452112659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-182.ec2.internal,}" Apr 17 16:31:10.817428 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.817394 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x4hvp" Apr 17 16:31:10.820006 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.819981 2569 manager.go:324] Recovery completed Apr 17 16:31:10.824157 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.824142 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:10.825391 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.825376 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x4hvp" Apr 17 16:31:10.826767 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.826754 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientMemory" Apr 
17 16:31:10.826841 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.826781 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:10.826841 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.826792 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:10.827346 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.827330 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:10.827346 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.827343 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:10.827445 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.827360 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:10.828874 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.828795 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-182.ec2.internal.18a731ecba31052c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-182.ec2.internal,UID:ip-10-0-136-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-182.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-182.ec2.internal,},FirstTimestamp:2026-04-17 16:31:10.82676766 +0000 UTC m=+0.483574859,LastTimestamp:2026-04-17 16:31:10.82676766 +0000 UTC m=+0.483574859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-182.ec2.internal,}" Apr 17 16:31:10.829664 ip-10-0-136-182 kubenswrapper[2569]: I0417 
16:31:10.829651 2569 policy_none.go:49] "None policy: Start" Apr 17 16:31:10.829664 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.829667 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:10.829753 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.829677 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:31:10.875457 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.875438 2569 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.875471 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.875483 2569 server.go:85] "Starting device plugin registration server" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.875748 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.875762 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.875906 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.875983 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.875991 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.876489 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 16:31:10.878571 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.876528 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:10.907966 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.907936 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:31:10.909191 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.909171 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:31:10.909258 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.909206 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:10.909258 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.909231 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:31:10.909258 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.909250 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:10.909384 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.909292 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:10.912292 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.912269 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:10.976104 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.976022 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:10.977235 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.977218 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:10.977306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.977250 2569 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:10.977306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.977260 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:10.977306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.977293 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-182.ec2.internal" Apr 17 16:31:10.989176 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:10.989156 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-182.ec2.internal" Apr 17 16:31:10.989176 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:10.989180 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-182.ec2.internal\": node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.006625 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.006597 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.009750 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.009733 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal"] Apr 17 16:31:11.009847 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.009834 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:11.011887 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.011869 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:11.011968 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.011897 2569 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:11.011968 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.011909 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:11.013021 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013008 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:11.013186 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013172 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.013220 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013201 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:11.013787 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013772 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:11.013845 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013820 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:11.013845 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013834 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:11.013920 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013880 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:11.013920 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013898 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 17 16:31:11.013920 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.013915 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:11.015025 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.015010 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.015084 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.015041 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:11.015825 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.015795 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:11.015905 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.015840 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:11.015905 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.015868 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:11.030332 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.030314 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-182.ec2.internal\" not found" node="ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.034732 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.034713 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-182.ec2.internal\" not found" node="ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.106755 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.106726 2569 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.204893 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.204860 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/10780fa939abfc038e268d8b8ca64834-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal\" (UID: \"10780fa939abfc038e268d8b8ca64834\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.204893 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.204895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10780fa939abfc038e268d8b8ca64834-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal\" (UID: \"10780fa939abfc038e268d8b8ca64834\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.205044 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.204912 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8d86482bf3f913f02d52bdaa697da1d7-config\") pod \"kube-apiserver-proxy-ip-10-0-136-182.ec2.internal\" (UID: \"8d86482bf3f913f02d52bdaa697da1d7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.206924 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.206909 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.305608 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.305570 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/10780fa939abfc038e268d8b8ca64834-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal\" (UID: \"10780fa939abfc038e268d8b8ca64834\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.305608 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.305613 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10780fa939abfc038e268d8b8ca64834-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal\" (UID: \"10780fa939abfc038e268d8b8ca64834\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.305754 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.305629 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8d86482bf3f913f02d52bdaa697da1d7-config\") pod \"kube-apiserver-proxy-ip-10-0-136-182.ec2.internal\" (UID: \"8d86482bf3f913f02d52bdaa697da1d7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.305754 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.305672 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10780fa939abfc038e268d8b8ca64834-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal\" (UID: \"10780fa939abfc038e268d8b8ca64834\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.305754 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.305693 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8d86482bf3f913f02d52bdaa697da1d7-config\") pod \"kube-apiserver-proxy-ip-10-0-136-182.ec2.internal\" (UID: \"8d86482bf3f913f02d52bdaa697da1d7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.305754 
ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.305711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/10780fa939abfc038e268d8b8ca64834-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal\" (UID: \"10780fa939abfc038e268d8b8ca64834\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.307681 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.307666 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.333890 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.333863 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.336611 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.336590 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.408252 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.408220 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.508837 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.508797 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.609461 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.609380 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.710056 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.710025 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.710056 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.710028 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 16:31:11.710870 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.710185 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:31:11.803208 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.803188 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:11.807457 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:11.807432 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d86482bf3f913f02d52bdaa697da1d7.slice/crio-5b7648a95026a8af62a16c4db38acb19dbe9938efc0338274474ba2a0b7c2989 WatchSource:0}: Error finding container 5b7648a95026a8af62a16c4db38acb19dbe9938efc0338274474ba2a0b7c2989: Status 404 returned error can't find the container with id 5b7648a95026a8af62a16c4db38acb19dbe9938efc0338274474ba2a0b7c2989 Apr 17 16:31:11.807627 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:11.807608 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10780fa939abfc038e268d8b8ca64834.slice/crio-ab53692fac09db3a6371f7a7ed0a3bf46e81ae0233c057c365cccef4938afbc3 WatchSource:0}: Error finding container ab53692fac09db3a6371f7a7ed0a3bf46e81ae0233c057c365cccef4938afbc3: Status 404 returned error can't find the container with id ab53692fac09db3a6371f7a7ed0a3bf46e81ae0233c057c365cccef4938afbc3 Apr 17 16:31:11.810396 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:11.810374 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-182.ec2.internal\" not found" Apr 17 16:31:11.812055 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.812041 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:31:11.822413 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.822392 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:11.827366 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.827320 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:10 +0000 UTC" deadline="2027-12-15 04:58:54.936634692 +0000 UTC" Apr 17 16:31:11.827366 ip-10-0-136-182 kubenswrapper[2569]: I0417 
16:31:11.827365 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14556h27m43.10927295s" Apr 17 16:31:11.851675 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.851645 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rr5wb" Apr 17 16:31:11.863373 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.863322 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rr5wb" Apr 17 16:31:11.870278 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.870256 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:11.903161 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.903137 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.912094 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.912038 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" event={"ID":"8d86482bf3f913f02d52bdaa697da1d7","Type":"ContainerStarted","Data":"5b7648a95026a8af62a16c4db38acb19dbe9938efc0338274474ba2a0b7c2989"} Apr 17 16:31:11.912906 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.912885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" event={"ID":"10780fa939abfc038e268d8b8ca64834","Type":"ContainerStarted","Data":"ab53692fac09db3a6371f7a7ed0a3bf46e81ae0233c057c365cccef4938afbc3"} Apr 17 16:31:11.917698 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.917683 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" 
Apr 17 16:31:11.919546 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.919531 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" Apr 17 16:31:11.926900 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:11.926884 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:12.112381 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.112354 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:12.126694 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.126620 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:12.784158 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.784123 2569 apiserver.go:52] "Watching apiserver" Apr 17 16:31:12.791663 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.791639 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 16:31:12.793729 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.793676 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-xcmwf","kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86","openshift-dns/node-resolver-qskd2","openshift-image-registry/node-ca-sfzc4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal","openshift-multus/multus-additional-cni-plugins-fvk4x","openshift-network-diagnostics/network-check-target-454rf","openshift-cluster-node-tuning-operator/tuned-h97kv","openshift-multus/multus-jdjbz","openshift-multus/network-metrics-daemon-njmgk","openshift-network-operator/iptables-alerter-qcf6j","openshift-ovn-kubernetes/ovnkube-node-2cjj2"] Apr 17 
16:31:12.795228 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.795206 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xcmwf" Apr 17 16:31:12.796463 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.796433 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" Apr 17 16:31:12.797670 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.797628 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qskd2" Apr 17 16:31:12.798514 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.798492 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:31:12.798818 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.798784 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sfzc4" Apr 17 16:31:12.798818 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.798796 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:31:12.799010 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.798863 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:31:12.799063 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.799027 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:31:12.799117 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.799092 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7mm8l\"" Apr 17 16:31:12.799399 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.799355 2569 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:31:12.799489 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.799432 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8nqx2\"" Apr 17 16:31:12.800229 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.800178 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:31:12.803013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.800585 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s6sps\"" Apr 17 16:31:12.803013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.800667 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 16:31:12.803013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.802265 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.803013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.802956 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 16:31:12.803254 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.803153 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 16:31:12.803254 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.803166 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 16:31:12.803351 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.803305 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ps24l\"" Apr 17 16:31:12.804334 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.804041 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:12.804530 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:12.804487 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:12.805127 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.805104 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pvr62\"" Apr 17 16:31:12.805225 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.805109 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:31:12.805718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.805413 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:31:12.805718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.805446 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:31:12.805718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.805461 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:31:12.805718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.805492 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:31:12.806271 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.806252 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.806352 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.806336 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.808129 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.808108 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:12.808274 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:12.808177 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:12.808351 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.808305 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7wkvc\"" Apr 17 16:31:12.808439 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.808416 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:12.808640 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.808619 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:31:12.808796 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.808785 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-m4s7h\"" Apr 17 16:31:12.808913 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.808880 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:12.809607 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.809589 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qcf6j" Apr 17 16:31:12.810856 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.810839 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.812443 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.812428 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 16:31:12.812531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.812428 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-bsg2m\"" Apr 17 16:31:12.812749 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.812735 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:12.812948 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.812844 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:31:12.812948 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.812848 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:12.813501 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.813488 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:31:12.813817 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.813788 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:31:12.813900 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.813791 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:31:12.814065 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.814051 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:31:12.814325 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.814311 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jhkwc\"" Apr 17 16:31:12.814428 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.814410 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:31:12.816159 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c3742c8-dbba-43ea-99fc-4321ab7b1156-host\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4" Apr 17 16:31:12.816256 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-776bn\" (UniqueName: \"kubernetes.io/projected/4415d384-34d0-412a-8d4e-c8f3077b28f5-kube-api-access-776bn\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.816256 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " 
pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:12.816256 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816224 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-tmp-dir\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2" Apr 17 16:31:12.816471 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysconfig\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.816471 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-cni-multus\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.816471 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816366 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-systemd\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.816471 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816400 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-run\") pod 
\"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.816471 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816424 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-sys\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.816697 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816476 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/611a935a-b080-4fc7-bb9b-c51d000c8434-cni-binary-copy\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.816697 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-daemon-config\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.816697 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816544 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-system-cni-dir\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.816697 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-k8s-cni-cncf-io\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.816697 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-cni-bin\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.816697 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-hostroot\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.816697 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816679 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c3742c8-dbba-43ea-99fc-4321ab7b1156-serviceca\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816703 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816818 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-os-release\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816854 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc-agent-certs\") pod \"konnectivity-agent-xcmwf\" (UID: \"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc\") " pod="kube-system/konnectivity-agent-xcmwf" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816876 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-registration-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-lib-modules\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816919 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-cnibin\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.817023 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:31:12.816949 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-netns\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816974 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysctl-conf\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.817023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.816998 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-var-lib-kubelet\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817030 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-system-cni-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817050 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-modprobe-d\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-device-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysctl-d\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-os-release\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817167 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4w6g\" (UniqueName: \"kubernetes.io/projected/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-kube-api-access-d4w6g\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2" Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" 
(UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-socket-dir-parent\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817281 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-socket-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817321 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-tmp\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817344 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-kubernetes\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817369 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-host\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-conf-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.817425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817451 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817549 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-sys-fs\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw5qb\" (UniqueName: \"kubernetes.io/projected/f63442f4-5899-41ba-b675-c0cb1c7837b8-kube-api-access-cw5qb\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-hosts-file\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxl45\" (UniqueName: \"kubernetes.io/projected/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-kube-api-access-mxl45\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817647 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-etc-kubernetes\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817683 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc2wm\" (UniqueName: \"kubernetes.io/projected/611a935a-b080-4fc7-bb9b-c51d000c8434-kube-api-access-fc2wm\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817721 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9gc\" (UniqueName: \"kubernetes.io/projected/5c3742c8-dbba-43ea-99fc-4321ab7b1156-kube-api-access-pv9gc\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-cnibin\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817772 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-tuned\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-cni-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817838 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-kubelet\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc-konnectivity-ca\") pod \"konnectivity-agent-xcmwf\" (UID: \"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc\") " pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:12.818027 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.817898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-multus-certs\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.864892 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.864856 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:11 +0000 UTC" deadline="2027-11-16 09:44:58.375660873 +0000 UTC"
Apr 17 16:31:12.864892 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.864893 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13865h13m45.510772356s"
Apr 17 16:31:12.905854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.905824 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 16:31:12.918100 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-var-lib-kubelet\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-system-cni-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918152 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-ovn\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-modprobe-d\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918194 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-device-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-var-lib-kubelet\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918204 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-system-cni-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysctl-d\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.918266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-cni-bin\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-device-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-os-release\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4w6g\" (UniqueName: \"kubernetes.io/projected/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-kube-api-access-d4w6g\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysctl-d\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918325 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-modprobe-d\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-socket-dir-parent\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918376 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-os-release\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-log-socket\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-socket-dir-parent\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918423 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovnkube-config\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918477 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-socket-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-tmp\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918578 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-kubelet\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovn-node-metrics-cert\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.918649 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-socket-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-kubernetes\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-host\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-conf-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-kubernetes\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7n4g\" (UniqueName: \"kubernetes.io/projected/56d56159-cb46-4e72-b5a0-a94c8cc6452d-kube-api-access-m7n4g\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-host\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-conf-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918869 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918904 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-sys-fs\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw5qb\" (UniqueName: \"kubernetes.io/projected/f63442f4-5899-41ba-b675-c0cb1c7837b8-kube-api-access-cw5qb\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.918987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-hosts-file\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2"
Apr 17 16:31:12.919407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919014 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl45\" (UniqueName: \"kubernetes.io/projected/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-kube-api-access-mxl45\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-etc-kubernetes\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc2wm\" (UniqueName: \"kubernetes.io/projected/611a935a-b080-4fc7-bb9b-c51d000c8434-kube-api-access-fc2wm\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919084 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-systemd-units\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9gc\" (UniqueName: \"kubernetes.io/projected/5c3742c8-dbba-43ea-99fc-4321ab7b1156-kube-api-access-pv9gc\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919194 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-cnibin\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919211 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-tuned\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-cni-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-kubelet\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc-konnectivity-ca\") pod \"konnectivity-agent-xcmwf\" (UID: \"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc\") " pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919284 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-cni-netd\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919302 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovnkube-script-lib\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-multus-certs\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919327 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919373 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-sys-fs\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:12.919944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919481 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-cnibin\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919727 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4415d384-34d0-412a-8d4e-c8f3077b28f5-cni-binary-copy\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-hosts-file\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-slash\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919816 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-systemd\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c3742c8-dbba-43ea-99fc-4321ab7b1156-host\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-776bn\" (UniqueName: \"kubernetes.io/projected/4415d384-34d0-412a-8d4e-c8f3077b28f5-kube-api-access-776bn\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919930 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d70a0068-8f05-43f8-965d-fae1fcc76d7c-host-slash\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj6lr\" (UniqueName: \"kubernetes.io/projected/d70a0068-8f05-43f8-965d-fae1fcc76d7c-kube-api-access-wj6lr\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919981 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbb7j\" (UniqueName: \"kubernetes.io/projected/79adaa92-9fae-4abb-b9ae-335440dbe8f1-kube-api-access-pbb7j\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.919983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-etc-kubernetes\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-tmp-dir\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920017 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc-konnectivity-ca\") pod \"konnectivity-agent-xcmwf\" (UID: \"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc\") " pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysconfig\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-cni-multus\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-cni-dir\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.920721 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d70a0068-8f05-43f8-965d-fae1fcc76d7c-iptables-alerter-script\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-kubelet\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 
16:31:12.920219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-systemd\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-run\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920317 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-run\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920317 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-systemd\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-sys\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c3742c8-dbba-43ea-99fc-4321ab7b1156-host\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920384 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/611a935a-b080-4fc7-bb9b-c51d000c8434-cni-binary-copy\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920414 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-node-log\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-multus-certs\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920481 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysconfig\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920503 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-cni-multus\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-daemon-config\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-sys\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-system-cni-dir\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.921523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920656 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-k8s-cni-cncf-io\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-tmp-dir\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920681 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-cni-bin\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920724 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-hostroot\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-var-lib-cni-bin\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920757 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-k8s-cni-cncf-io\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920782 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-hostroot\") pod 
\"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920794 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-system-cni-dir\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-run-netns\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-var-lib-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920892 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-etc-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5c3742c8-dbba-43ea-99fc-4321ab7b1156-serviceca\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/611a935a-b080-4fc7-bb9b-c51d000c8434-cni-binary-copy\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.920959 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-os-release\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc-agent-certs\") pod \"konnectivity-agent-xcmwf\" (UID: \"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc\") " pod="kube-system/konnectivity-agent-xcmwf" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/611a935a-b080-4fc7-bb9b-c51d000c8434-multus-daemon-config\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921133 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-env-overrides\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.922342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-registration-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-os-release\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-lib-modules\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-cnibin\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-netns\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921339 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c3742c8-dbba-43ea-99fc-4321ab7b1156-serviceca\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-cnibin\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-lib-modules\") 
pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/611a935a-b080-4fc7-bb9b-c51d000c8434-host-run-netns\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921467 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f63442f4-5899-41ba-b675-c0cb1c7837b8-registration-dir\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysctl-conf\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.921737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-sysctl-conf\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.922141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4415d384-34d0-412a-8d4e-c8f3077b28f5-tuning-conf-dir\") 
pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.922395 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-tmp\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.923076 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.922502 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-etc-tuned\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:12.923619 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.923600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc-agent-certs\") pod \"konnectivity-agent-xcmwf\" (UID: \"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc\") " pod="kube-system/konnectivity-agent-xcmwf" Apr 17 16:31:12.929449 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:12.929424 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:12.929567 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:12.929453 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:12.929567 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:12.929466 2569 projected.go:194] Error preparing data for projected volume kube-api-access-swls8 for pod 
openshift-network-diagnostics/network-check-target-454rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:12.929567 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:12.929560 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8 podName:cf2a6beb-fbfb-4062-b87b-a178033b242c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:13.429527907 +0000 UTC m=+3.086335107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-swls8" (UniqueName: "kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8") pod "network-check-target-454rf" (UID: "cf2a6beb-fbfb-4062-b87b-a178033b242c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:12.931741 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.931712 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4w6g\" (UniqueName: \"kubernetes.io/projected/c855efcb-8d09-4e04-8063-d5bb5ae67dc3-kube-api-access-d4w6g\") pod \"node-resolver-qskd2\" (UID: \"c855efcb-8d09-4e04-8063-d5bb5ae67dc3\") " pod="openshift-dns/node-resolver-qskd2" Apr 17 16:31:12.932158 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.932129 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw5qb\" (UniqueName: \"kubernetes.io/projected/f63442f4-5899-41ba-b675-c0cb1c7837b8-kube-api-access-cw5qb\") pod \"aws-ebs-csi-driver-node-2lm86\" (UID: \"f63442f4-5899-41ba-b675-c0cb1c7837b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" Apr 17 16:31:12.932590 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.932567 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pv9gc\" (UniqueName: \"kubernetes.io/projected/5c3742c8-dbba-43ea-99fc-4321ab7b1156-kube-api-access-pv9gc\") pod \"node-ca-sfzc4\" (UID: \"5c3742c8-dbba-43ea-99fc-4321ab7b1156\") " pod="openshift-image-registry/node-ca-sfzc4" Apr 17 16:31:12.932700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.932657 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-776bn\" (UniqueName: \"kubernetes.io/projected/4415d384-34d0-412a-8d4e-c8f3077b28f5-kube-api-access-776bn\") pod \"multus-additional-cni-plugins-fvk4x\" (UID: \"4415d384-34d0-412a-8d4e-c8f3077b28f5\") " pod="openshift-multus/multus-additional-cni-plugins-fvk4x" Apr 17 16:31:12.932700 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.932668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc2wm\" (UniqueName: \"kubernetes.io/projected/611a935a-b080-4fc7-bb9b-c51d000c8434-kube-api-access-fc2wm\") pod \"multus-jdjbz\" (UID: \"611a935a-b080-4fc7-bb9b-c51d000c8434\") " pod="openshift-multus/multus-jdjbz" Apr 17 16:31:12.933333 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:12.933312 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl45\" (UniqueName: \"kubernetes.io/projected/5fc75060-d6e4-4b7a-bba7-b763cd3b68a1-kube-api-access-mxl45\") pod \"tuned-h97kv\" (UID: \"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1\") " pod="openshift-cluster-node-tuning-operator/tuned-h97kv" Apr 17 16:31:13.022322 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022287 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:13.022322 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022322 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-systemd-units\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022549 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-cni-netd\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022549 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.022433 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:13.022549 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022433 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-systemd-units\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022549 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovnkube-script-lib\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022549 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022483 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-cni-netd\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022549 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.022508 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:13.522488977 +0000 UTC m=+3.179296186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022574 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-slash\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-systemd\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d70a0068-8f05-43f8-965d-fae1fcc76d7c-host-slash\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj6lr\" (UniqueName: \"kubernetes.io/projected/d70a0068-8f05-43f8-965d-fae1fcc76d7c-kube-api-access-wj6lr\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbb7j\" (UniqueName: \"kubernetes.io/projected/79adaa92-9fae-4abb-b9ae-335440dbe8f1-kube-api-access-pbb7j\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-slash\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d70a0068-8f05-43f8-965d-fae1fcc76d7c-iptables-alerter-script\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022750 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d70a0068-8f05-43f8-965d-fae1fcc76d7c-host-slash\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022775 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022797 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-systemd\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-node-log\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-run-netns\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-var-lib-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.022883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022889 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-etc-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-env-overrides\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-ovn\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-cni-bin\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-var-lib-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023014 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-log-socket\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovnkube-config\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023050 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-node-log\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-kubelet\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovn-node-metrics-cert\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7n4g\" (UniqueName: \"kubernetes.io/projected/56d56159-cb46-4e72-b5a0-a94c8cc6452d-kube-api-access-m7n4g\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023230 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovnkube-script-lib\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023293 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022887 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.022858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023341 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-etc-openvswitch\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.023531 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d70a0068-8f05-43f8-965d-fae1fcc76d7c-iptables-alerter-script\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:13.024182 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-run-netns\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.024182 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-cni-bin\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.024182 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023419 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-host-kubelet\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.024182 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023440 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-run-ovn\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.024182 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023468 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/56d56159-cb46-4e72-b5a0-a94c8cc6452d-log-socket\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.024182 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovnkube-config\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.024182 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.023646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/56d56159-cb46-4e72-b5a0-a94c8cc6452d-env-overrides\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.025758 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.025738 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/56d56159-cb46-4e72-b5a0-a94c8cc6452d-ovn-node-metrics-cert\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.032062 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.032044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7n4g\" (UniqueName: \"kubernetes.io/projected/56d56159-cb46-4e72-b5a0-a94c8cc6452d-kube-api-access-m7n4g\") pod \"ovnkube-node-2cjj2\" (UID: \"56d56159-cb46-4e72-b5a0-a94c8cc6452d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.032407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.032390 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj6lr\" (UniqueName: \"kubernetes.io/projected/d70a0068-8f05-43f8-965d-fae1fcc76d7c-kube-api-access-wj6lr\") pod \"iptables-alerter-qcf6j\" (UID: \"d70a0068-8f05-43f8-965d-fae1fcc76d7c\") " pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:13.032476 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.032424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbb7j\" (UniqueName: \"kubernetes.io/projected/79adaa92-9fae-4abb-b9ae-335440dbe8f1-kube-api-access-pbb7j\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:13.107482 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.107404 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:13.116217 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.116192 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86"
Apr 17 16:31:13.126050 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.126022 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qskd2"
Apr 17 16:31:13.131016 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.130996 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sfzc4"
Apr 17 16:31:13.137595 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.137573 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fvk4x"
Apr 17 16:31:13.144226 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.144205 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h97kv"
Apr 17 16:31:13.148856 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.148837 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jdjbz"
Apr 17 16:31:13.155394 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.155376 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qcf6j"
Apr 17 16:31:13.161962 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.161939 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:13.239038 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.239007 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:31:13.400581 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.400553 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd70a0068_8f05_43f8_965d_fae1fcc76d7c.slice/crio-0c05c6b809835117b1750911e48b8ebb4908596efecb5361f61048f9fc4e5316 WatchSource:0}: Error finding container 0c05c6b809835117b1750911e48b8ebb4908596efecb5361f61048f9fc4e5316: Status 404 returned error can't find the container with id 0c05c6b809835117b1750911e48b8ebb4908596efecb5361f61048f9fc4e5316
Apr 17 16:31:13.407001 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.406974 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c3742c8_dbba_43ea_99fc_4321ab7b1156.slice/crio-d5d1752fcddb9c4ccc42bde8d58183298b3c56bdfbc59e156a462f1c5c874a97 WatchSource:0}: Error finding container d5d1752fcddb9c4ccc42bde8d58183298b3c56bdfbc59e156a462f1c5c874a97: Status 404 returned error can't find the container with id d5d1752fcddb9c4ccc42bde8d58183298b3c56bdfbc59e156a462f1c5c874a97
Apr 17 16:31:13.408209 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.408132 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4415d384_34d0_412a_8d4e_c8f3077b28f5.slice/crio-8515e8fb75ee33de0deb0a02a51c57fb6b94d5f3f9fff8e7851c5de460aec532 WatchSource:0}: Error finding container 8515e8fb75ee33de0deb0a02a51c57fb6b94d5f3f9fff8e7851c5de460aec532: Status 404 returned error can't find the container with id 8515e8fb75ee33de0deb0a02a51c57fb6b94d5f3f9fff8e7851c5de460aec532
Apr 17 16:31:13.408647 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.408624 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc855efcb_8d09_4e04_8063_d5bb5ae67dc3.slice/crio-6057e5c834050e6e9c4a57ae6b3833d139b3b1008763c21af2c8457e9091984a WatchSource:0}: Error finding container 6057e5c834050e6e9c4a57ae6b3833d139b3b1008763c21af2c8457e9091984a: Status 404 returned error can't find the container with id 6057e5c834050e6e9c4a57ae6b3833d139b3b1008763c21af2c8457e9091984a
Apr 17 16:31:13.409730 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.409709 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63442f4_5899_41ba_b675_c0cb1c7837b8.slice/crio-a60fc2b1011d8533cdf15e72fa4f7d4f8fdebfe89f58e89b53c591700ca2c81b WatchSource:0}: Error finding container a60fc2b1011d8533cdf15e72fa4f7d4f8fdebfe89f58e89b53c591700ca2c81b: Status 404 returned error can't find the container with id a60fc2b1011d8533cdf15e72fa4f7d4f8fdebfe89f58e89b53c591700ca2c81b
Apr 17 16:31:13.411817 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.411782 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611a935a_b080_4fc7_bb9b_c51d000c8434.slice/crio-43cc7e87375348eed17c80329b3c26722d16a06b9f32430e0796815ec0b2ecdb WatchSource:0}: Error finding container 43cc7e87375348eed17c80329b3c26722d16a06b9f32430e0796815ec0b2ecdb: Status 404 returned error can't find the container with id 43cc7e87375348eed17c80329b3c26722d16a06b9f32430e0796815ec0b2ecdb
Apr 17 16:31:13.414984 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.414797 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56445ed_a7d8_44cf_b91d_6cd2dcb4d6bc.slice/crio-ad18d247e7f5782c3433bd4add744b5e4ae88ff552379db688e009cb2e542681 WatchSource:0}: Error finding container ad18d247e7f5782c3433bd4add744b5e4ae88ff552379db688e009cb2e542681: Status 404 returned error can't find the container with id ad18d247e7f5782c3433bd4add744b5e4ae88ff552379db688e009cb2e542681
Apr 17 16:31:13.416365 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:13.416296 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc75060_d6e4_4b7a_bba7_b763cd3b68a1.slice/crio-58a9531a0d1bf1f16bc264c696dc73b7cd731585500151115c20b8a547743b9a WatchSource:0}: Error finding container 58a9531a0d1bf1f16bc264c696dc73b7cd731585500151115c20b8a547743b9a: Status 404 returned error can't find the container with id 58a9531a0d1bf1f16bc264c696dc73b7cd731585500151115c20b8a547743b9a
Apr 17 16:31:13.527313 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.527145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:13.527466 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.527326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:13.527466 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.527283 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:13.527466 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.527417 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:13.527466 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.527428 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:13.527466 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.527437 2569 projected.go:194] Error preparing data for projected volume kube-api-access-swls8 for pod openshift-network-diagnostics/network-check-target-454rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:13.527466 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.527445 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:14.527424553 +0000 UTC m=+4.184231761 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:13.527762 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.527475 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8 podName:cf2a6beb-fbfb-4062-b87b-a178033b242c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:14.52746452 +0000 UTC m=+4.184271706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-swls8" (UniqueName: "kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8") pod "network-check-target-454rf" (UID: "cf2a6beb-fbfb-4062-b87b-a178033b242c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:13.650268 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.650172 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5vcm7"]
Apr 17 16:31:13.651912 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.651894 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.652003 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.651957 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:13.728754 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.728719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.728920 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.728784 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/63fffba4-9c72-4652-811d-ebe876a5f73b-kubelet-config\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.728920 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.728839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/63fffba4-9c72-4652-811d-ebe876a5f73b-dbus\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.829393 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.829355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.829849 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.829450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/63fffba4-9c72-4652-811d-ebe876a5f73b-kubelet-config\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.829849 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.829502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/63fffba4-9c72-4652-811d-ebe876a5f73b-dbus\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.829849 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.829503 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:13.829849 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:13.829579 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret podName:63fffba4-9c72-4652-811d-ebe876a5f73b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:14.329557955 +0000 UTC m=+3.986365160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret") pod "global-pull-secret-syncer-5vcm7" (UID: "63fffba4-9c72-4652-811d-ebe876a5f73b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:13.829849 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.829574 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/63fffba4-9c72-4652-811d-ebe876a5f73b-kubelet-config\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.829849 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.829661 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/63fffba4-9c72-4652-811d-ebe876a5f73b-dbus\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:13.865114 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.865059 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:11 +0000 UTC" deadline="2027-10-17 11:03:20.31777311 +0000 UTC"
Apr 17 16:31:13.865114 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.865101 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13146h32m6.452675557s"
Apr 17 16:31:13.917875 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.917339 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qskd2" event={"ID":"c855efcb-8d09-4e04-8063-d5bb5ae67dc3","Type":"ContainerStarted","Data":"6057e5c834050e6e9c4a57ae6b3833d139b3b1008763c21af2c8457e9091984a"}
Apr 17 16:31:13.920544 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.920516 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerStarted","Data":"8515e8fb75ee33de0deb0a02a51c57fb6b94d5f3f9fff8e7851c5de460aec532"}
Apr 17 16:31:13.923467 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.923047 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sfzc4" event={"ID":"5c3742c8-dbba-43ea-99fc-4321ab7b1156","Type":"ContainerStarted","Data":"d5d1752fcddb9c4ccc42bde8d58183298b3c56bdfbc59e156a462f1c5c874a97"}
Apr 17 16:31:13.925462 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.925436 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"7057b6a8911d643489c44c6c029c9fb9bf95ef87791584dcc26c0694fc441b2a"}
Apr 17 16:31:13.931933 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.931906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h97kv" event={"ID":"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1","Type":"ContainerStarted","Data":"58a9531a0d1bf1f16bc264c696dc73b7cd731585500151115c20b8a547743b9a"}
Apr 17 16:31:13.937913 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.937849 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xcmwf" event={"ID":"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc","Type":"ContainerStarted","Data":"ad18d247e7f5782c3433bd4add744b5e4ae88ff552379db688e009cb2e542681"}
Apr 17 16:31:13.947314 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.947284 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jdjbz" event={"ID":"611a935a-b080-4fc7-bb9b-c51d000c8434","Type":"ContainerStarted","Data":"43cc7e87375348eed17c80329b3c26722d16a06b9f32430e0796815ec0b2ecdb"}
Apr 17 16:31:13.950784 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.950757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qcf6j" event={"ID":"d70a0068-8f05-43f8-965d-fae1fcc76d7c","Type":"ContainerStarted","Data":"0c05c6b809835117b1750911e48b8ebb4908596efecb5361f61048f9fc4e5316"}
Apr 17 16:31:13.959524 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.959495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" event={"ID":"8d86482bf3f913f02d52bdaa697da1d7","Type":"ContainerStarted","Data":"711534830f0a0d07feca9aae27f57704292e206891ceb8d1b58c5d32f7c50a1e"}
Apr 17 16:31:13.974356 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.974272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" event={"ID":"f63442f4-5899-41ba-b675-c0cb1c7837b8","Type":"ContainerStarted","Data":"a60fc2b1011d8533cdf15e72fa4f7d4f8fdebfe89f58e89b53c591700ca2c81b"}
Apr 17 16:31:13.975877 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:13.975271 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-182.ec2.internal" podStartSLOduration=2.975255454 podStartE2EDuration="2.975255454s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:13.974985738 +0000 UTC m=+3.631792941" watchObservedRunningTime="2026-04-17 16:31:13.975255454 +0000 UTC m=+3.632062665"
Apr 17 16:31:14.334745 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:14.334706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: 
\"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:14.334941 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.334895 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:14.335000 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.334961 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret podName:63fffba4-9c72-4652-811d-ebe876a5f73b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:15.334942887 +0000 UTC m=+4.991750086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret") pod "global-pull-secret-syncer-5vcm7" (UID: "63fffba4-9c72-4652-811d-ebe876a5f73b") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:14.536226 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:14.536191 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:14.536412 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:14.536249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:14.536412 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.536384 2569 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:14.536412 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.536403 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:14.536571 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.536417 2569 projected.go:194] Error preparing data for projected volume kube-api-access-swls8 for pod openshift-network-diagnostics/network-check-target-454rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:14.536571 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.536472 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8 podName:cf2a6beb-fbfb-4062-b87b-a178033b242c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:16.536455278 +0000 UTC m=+6.193262467 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-swls8" (UniqueName: "kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8") pod "network-check-target-454rf" (UID: "cf2a6beb-fbfb-4062-b87b-a178033b242c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:14.536994 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.536893 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:14.536994 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.536945 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:16.536930655 +0000 UTC m=+6.193737854 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:14.909995 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:14.909919 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:14.910448 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.910043 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b" Apr 17 16:31:14.910521 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:14.910482 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:14.910647 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.910591 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:14.910740 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:14.910697 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:14.910791 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:14.910770 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:15.004740 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:15.004702 2569 generic.go:358] "Generic (PLEG): container finished" podID="10780fa939abfc038e268d8b8ca64834" containerID="02a7ba0bb48710ac5cff43e00c3a8fce24d818d96c65f669b378faefd77c1336" exitCode=0 Apr 17 16:31:15.005665 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:15.005637 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" event={"ID":"10780fa939abfc038e268d8b8ca64834","Type":"ContainerDied","Data":"02a7ba0bb48710ac5cff43e00c3a8fce24d818d96c65f669b378faefd77c1336"} Apr 17 16:31:15.345598 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:15.345534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:15.345882 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:15.345696 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:15.345882 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:15.345759 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret podName:63fffba4-9c72-4652-811d-ebe876a5f73b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:17.345741127 +0000 UTC m=+7.002548320 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret") pod "global-pull-secret-syncer-5vcm7" (UID: "63fffba4-9c72-4652-811d-ebe876a5f73b") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:16.010630 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:16.010540 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" event={"ID":"10780fa939abfc038e268d8b8ca64834","Type":"ContainerStarted","Data":"6c61c9f3b8ab28c20f66df79a5904d597a664a05576d2331847593df8350492f"} Apr 17 16:31:16.026657 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:16.025990 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-182.ec2.internal" podStartSLOduration=5.025970393 podStartE2EDuration="5.025970393s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:16.024757039 +0000 UTC m=+5.681564253" watchObservedRunningTime="2026-04-17 16:31:16.025970393 +0000 UTC m=+5.682777603" Apr 17 16:31:16.556270 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:16.556228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:16.556425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:16.556284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: 
\"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:16.556425 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.556397 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:16.556496 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.556446 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:16.556496 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.556462 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:16.556496 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.556475 2569 projected.go:194] Error preparing data for projected volume kube-api-access-swls8 for pod openshift-network-diagnostics/network-check-target-454rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:16.556496 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.556480 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:20.55646024 +0000 UTC m=+10.213267438 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:16.556659 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.556523 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8 podName:cf2a6beb-fbfb-4062-b87b-a178033b242c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:20.556507236 +0000 UTC m=+10.213314426 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-swls8" (UniqueName: "kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8") pod "network-check-target-454rf" (UID: "cf2a6beb-fbfb-4062-b87b-a178033b242c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:16.909953 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:16.909659 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:16.909953 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:16.909659 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:16.909953 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.909898 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:16.909953 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:16.909949 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:16.910265 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.910025 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b" Apr 17 16:31:16.910459 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:16.909817 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:17.362159 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:17.362124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:17.362635 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:17.362281 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:17.362635 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:17.362358 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret podName:63fffba4-9c72-4652-811d-ebe876a5f73b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:21.362339217 +0000 UTC m=+11.019146417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret") pod "global-pull-secret-syncer-5vcm7" (UID: "63fffba4-9c72-4652-811d-ebe876a5f73b") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:18.909815 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:18.909781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:18.910251 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:18.909935 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:18.910376 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:18.910357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:18.910492 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:18.910453 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b" Apr 17 16:31:18.910558 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:18.910486 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:18.910599 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:18.910585 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:20.592772 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:20.592732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:20.593249 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:20.592792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:20.593249 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.592880 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:20.593249 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.592939 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:20.593249 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.592958 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:28.592937083 +0000 UTC m=+18.249744284 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:20.593249 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.592961 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:20.593249 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.592976 2569 projected.go:194] Error preparing data for projected volume kube-api-access-swls8 for pod openshift-network-diagnostics/network-check-target-454rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:20.593249 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.593031 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8 podName:cf2a6beb-fbfb-4062-b87b-a178033b242c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:28.593012669 +0000 UTC m=+18.249819870 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-swls8" (UniqueName: "kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8") pod "network-check-target-454rf" (UID: "cf2a6beb-fbfb-4062-b87b-a178033b242c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:20.910832 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:20.910787 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:20.910988 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.910927 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:20.911284 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:20.911244 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:20.911284 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:20.911282 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:20.911425 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.911367 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b" Apr 17 16:31:20.911460 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:20.911443 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:21.399414 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:21.399373 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:21.399601 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:21.399554 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:21.399667 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:21.399630 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret podName:63fffba4-9c72-4652-811d-ebe876a5f73b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:29.399611156 +0000 UTC m=+19.056418357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret") pod "global-pull-secret-syncer-5vcm7" (UID: "63fffba4-9c72-4652-811d-ebe876a5f73b") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:22.910237 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:22.910139 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:22.910631 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:22.910139 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:22.910631 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:22.910267 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:22.910631 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:22.910372 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:22.910631 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:22.910152 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:22.910631 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:22.910499 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:24.910339 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:24.910297 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:24.910841 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:24.910304 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:24.910841 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:24.910432 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:24.910841 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:24.910304 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:24.910841 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:24.910550 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:24.910841 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:24.910659 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:26.910023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:26.909905 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:26.910023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:26.909905 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:26.910631 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:26.910035 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:26.910631 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:26.910162 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:26.910631 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:26.910197 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:26.910631 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:26.910257 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:28.652868 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:28.652822 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:28.653397 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:28.652891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:28.653397 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.653003 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:28.653397 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.653034 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:28.653397 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.653052 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:28.653397 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.653066 2569 projected.go:194] Error preparing data for projected volume kube-api-access-swls8 for pod openshift-network-diagnostics/network-check-target-454rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:28.653397 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.653090 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.653063343 +0000 UTC m=+34.309870554 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:28.653397 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.653114 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8 podName:cf2a6beb-fbfb-4062-b87b-a178033b242c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.653099556 +0000 UTC m=+34.309906744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-swls8" (UniqueName: "kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8") pod "network-check-target-454rf" (UID: "cf2a6beb-fbfb-4062-b87b-a178033b242c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:28.910509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:28.910429 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:28.910673 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:28.910429 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:28.910673 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.910571 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:28.910673 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.910646 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:28.910673 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:28.910430 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:28.910848 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:28.910726 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:29.458034 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:29.457986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:29.458219 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:29.458165 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:29.458277 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:29.458246 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret podName:63fffba4-9c72-4652-811d-ebe876a5f73b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.458223361 +0000 UTC m=+35.115030552 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret") pod "global-pull-secret-syncer-5vcm7" (UID: "63fffba4-9c72-4652-811d-ebe876a5f73b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:30.911142 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:30.911107 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:30.911685 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:30.911235 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:30.911685 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:30.911260 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:30.911685 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:30.911283 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:30.911685 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:30.911380 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:30.911685 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:30.911445 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:31.041495 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.037779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qskd2" event={"ID":"c855efcb-8d09-4e04-8063-d5bb5ae67dc3","Type":"ContainerStarted","Data":"0557239ff6710cea7bb7185ccdd29328eb0c51301ac675988b73ea816cd67745"}
Apr 17 16:31:31.041495 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.039218 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerStarted","Data":"30de7cdf198ce55a8846473f40a8fcc779490bad913ea97cbf42dc974ff16b4c"}
Apr 17 16:31:31.041495 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.040403 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sfzc4" event={"ID":"5c3742c8-dbba-43ea-99fc-4321ab7b1156","Type":"ContainerStarted","Data":"bd8ebb268d9ddaf00f0efa3bcecba2f911b96c43de3401727beeda705b31fa0c"}
Apr 17 16:31:31.042237 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.042213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"4dede960d4636bae9a01595dc6e99607e497f3b172e62b402054791512a9aa26"}
Apr 17 16:31:31.042303 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.042250 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"feb5dfea0ffa23c7757adebec5596b2d55c0bff673405f78b0b0c146b257397d"}
Apr 17 16:31:31.042303 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.042261 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"c9cb6857d1a7d2a51d04afb1f6ecdcd928ce7855bcd7e8a0e496affb89d37865"}
Apr 17 16:31:31.043537 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.043513 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h97kv" event={"ID":"5fc75060-d6e4-4b7a-bba7-b763cd3b68a1","Type":"ContainerStarted","Data":"0c8a02b415bc0e1d9378ead13da21db58a5ab4c7ebb7f19afdff2ee7be685faa"}
Apr 17 16:31:31.044590 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.044573 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xcmwf" event={"ID":"d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc","Type":"ContainerStarted","Data":"4b3e5191f75c3c88c8b03a925c777e09018744edefe1d47b40f4b2a78544066d"}
Apr 17 16:31:31.045818 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.045779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jdjbz" event={"ID":"611a935a-b080-4fc7-bb9b-c51d000c8434","Type":"ContainerStarted","Data":"f0116c69964ea161d96b5f651b9cc817f70b5b5f1ff7b741b5621b9495204961"}
Apr 17 16:31:31.047174 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.047156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" event={"ID":"f63442f4-5899-41ba-b675-c0cb1c7837b8","Type":"ContainerStarted","Data":"279b569b8b2cf91be3ba650fdb6a53c8a0bea53542656052ccc2c436541c233f"}
Apr 17 16:31:31.050968 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.050935 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qskd2" podStartSLOduration=2.872162448 podStartE2EDuration="20.050925839s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.411171727 +0000 UTC m=+3.067978915" lastFinishedPulling="2026-04-17 16:31:30.589935112 +0000 UTC m=+20.246742306" observedRunningTime="2026-04-17 16:31:31.050754788 +0000 UTC m=+20.707562007" watchObservedRunningTime="2026-04-17 16:31:31.050925839 +0000 UTC m=+20.707733047"
Apr 17 16:31:31.064402 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.064354 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xcmwf" podStartSLOduration=7.6194760630000005 podStartE2EDuration="20.06434016s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.417496313 +0000 UTC m=+3.074303499" lastFinishedPulling="2026-04-17 16:31:25.862360397 +0000 UTC m=+15.519167596" observedRunningTime="2026-04-17 16:31:31.06410621 +0000 UTC m=+20.720913419" watchObservedRunningTime="2026-04-17 16:31:31.06434016 +0000 UTC m=+20.721147367"
Apr 17 16:31:31.077616 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.077578 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-h97kv" podStartSLOduration=2.903329194 podStartE2EDuration="20.077566553s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.417503297 +0000 UTC m=+3.074310482" lastFinishedPulling="2026-04-17 16:31:30.591740648 +0000 UTC m=+20.248547841" observedRunningTime="2026-04-17 16:31:31.077451374 +0000 UTC m=+20.734258592" watchObservedRunningTime="2026-04-17 16:31:31.077566553 +0000 UTC m=+20.734373752"
Apr 17 16:31:31.106009 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.104299 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sfzc4" podStartSLOduration=2.923615751 podStartE2EDuration="20.104281831s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.409320937 +0000 UTC m=+3.066128127" lastFinishedPulling="2026-04-17 16:31:30.589987019 +0000 UTC m=+20.246794207" observedRunningTime="2026-04-17 16:31:31.103053804 +0000 UTC m=+20.759861014" watchObservedRunningTime="2026-04-17 16:31:31.104281831 +0000 UTC m=+20.761089040"
Apr 17 16:31:31.146015 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.145952 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jdjbz" podStartSLOduration=2.95087132 podStartE2EDuration="20.145932384s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.414627613 +0000 UTC m=+3.071434813" lastFinishedPulling="2026-04-17 16:31:30.609688687 +0000 UTC m=+20.266495877" observedRunningTime="2026-04-17 16:31:31.145375024 +0000 UTC m=+20.802182231" watchObservedRunningTime="2026-04-17 16:31:31.145932384 +0000 UTC m=+20.802739635"
Apr 17 16:31:31.207787 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.207753 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:31.208363 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:31.208346 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:32.051904 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.051662 2569 generic.go:358] "Generic (PLEG): container finished" podID="4415d384-34d0-412a-8d4e-c8f3077b28f5" containerID="30de7cdf198ce55a8846473f40a8fcc779490bad913ea97cbf42dc974ff16b4c" exitCode=0
Apr 17 16:31:32.052416 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.051745 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerDied","Data":"30de7cdf198ce55a8846473f40a8fcc779490bad913ea97cbf42dc974ff16b4c"}
Apr 17 16:31:32.054543 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.054529 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:31:32.054863 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.054840 2569 generic.go:358] "Generic (PLEG): container finished" podID="56d56159-cb46-4e72-b5a0-a94c8cc6452d" containerID="feb5dfea0ffa23c7757adebec5596b2d55c0bff673405f78b0b0c146b257397d" exitCode=1
Apr 17 16:31:32.054969 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.054935 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerDied","Data":"feb5dfea0ffa23c7757adebec5596b2d55c0bff673405f78b0b0c146b257397d"}
Apr 17 16:31:32.055050 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.054979 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"45f1f5f7fef81c4595986f90e768afe775159ceba3d605a8014255b182af0e4a"}
Apr 17 16:31:32.055050 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.054991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"65fe6c953ea74b2593016134b57ebd7e15182fd116e1177fc88f6bfca21629ad"}
Apr 17 16:31:32.055050 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.054998 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"2cd25a2fc1e7994f2b13e4444958b9762d014bdc9995ad23a1ed846a8462894a"}
Apr 17 16:31:32.057883 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.057834 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:32.057980 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.057904 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xcmwf"
Apr 17 16:31:32.202054 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.202025 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 16:31:32.888944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.888840 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:32.202047739Z","UUID":"cff0d45d-0aa1-4c8c-9350-6643cb86a63b","Handler":null,"Name":"","Endpoint":""}
Apr 17 16:31:32.891768 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.891745 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 16:31:32.891768 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.891771 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 16:31:32.909830 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.909785 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:32.909986 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.909842 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:32.909986 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:32.909928 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:32.910095 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:32.910005 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:32.910095 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:32.910000 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:32.910167 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:32.910118 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:33.058603 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:33.058551 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qcf6j" event={"ID":"d70a0068-8f05-43f8-965d-fae1fcc76d7c","Type":"ContainerStarted","Data":"4d8b94490951178e795afbd0ed11228993418b6bad0b7d0b7a7b3dc8df2038c9"}
Apr 17 16:31:33.060591 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:33.060564 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" event={"ID":"f63442f4-5899-41ba-b675-c0cb1c7837b8","Type":"ContainerStarted","Data":"ea6d2abf97cbea1f748e9bd96db1919f4da775ccad67d0243284efaa168dbb4e"}
Apr 17 16:31:33.085343 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:33.085280 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qcf6j" podStartSLOduration=4.899857021 podStartE2EDuration="22.085265547s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.404611914 +0000 UTC m=+3.061419100" lastFinishedPulling="2026-04-17 16:31:30.590020429 +0000 UTC m=+20.246827626" observedRunningTime="2026-04-17 16:31:33.084333653 +0000 UTC m=+22.741140863" watchObservedRunningTime="2026-04-17 16:31:33.085265547 +0000 UTC m=+22.742072755"
Apr 17 16:31:34.065084 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:34.064837 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" event={"ID":"f63442f4-5899-41ba-b675-c0cb1c7837b8","Type":"ContainerStarted","Data":"0c921f3537c0d03ad1b11b309449a5928054728cc80214c61ecb806f3abe95ba"}
Apr 17 16:31:34.067936 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:34.067914 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:31:34.068313 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:34.068290 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"0148c34fb6eb44cd98ad7cb8b0e4350e28a2be6cbe748b3d4cc8876428a47b22"}
Apr 17 16:31:34.081940 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:34.081897 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lm86" podStartSLOduration=3.212716682 podStartE2EDuration="23.081885531s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.412859412 +0000 UTC m=+3.069666604" lastFinishedPulling="2026-04-17 16:31:33.282028267 +0000 UTC m=+22.938835453" observedRunningTime="2026-04-17 16:31:34.081720945 +0000 UTC m=+23.738528152" watchObservedRunningTime="2026-04-17 16:31:34.081885531 +0000 UTC m=+23.738692718"
Apr 17 16:31:34.909711 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:34.909684 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:34.909897 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:34.909684 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:34.909897 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:34.909793 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:34.909897 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:34.909685 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:34.910023 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:34.909895 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:34.910023 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:34.909939 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:36.909899 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:36.909869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:36.910691 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:36.909869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:36.910691 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:36.909985 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:36.910691 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:36.909869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:36.910691 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:36.910077 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:36.910691 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:36.910168 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:38.078631 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.078453 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:31:38.079172 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.078929 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"51db19d085c76bd9eb9509318b97baa31ac08edfe973059f1e0d7e04811672e3"}
Apr 17 16:31:38.079282 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.079263 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:38.079378 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.079291 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:38.079473 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.079455 2569 scope.go:117] "RemoveContainer" containerID="feb5dfea0ffa23c7757adebec5596b2d55c0bff673405f78b0b0c146b257397d"
Apr 17 16:31:38.080927 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.080903 2569 generic.go:358] "Generic (PLEG): container finished" podID="4415d384-34d0-412a-8d4e-c8f3077b28f5" containerID="d00dba56fee34b0bebb491c474f0ad02bd89885673135dbace60ff557f489aab" exitCode=0
Apr 17 16:31:38.081012 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.080942 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerDied","Data":"d00dba56fee34b0bebb491c474f0ad02bd89885673135dbace60ff557f489aab"}
Apr 17 16:31:38.095578 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.095554 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:38.095707 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.095623 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2"
Apr 17 16:31:38.912089 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.912062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:38.912202 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.912062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:38.912202 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:38.912160 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:31:38.912270 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:38.912062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:38.912270 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:38.912248 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b"
Apr 17 16:31:38.912356 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:38.912290 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c"
Apr 17 16:31:39.014664 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.014635 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5vcm7"]
Apr 17 16:31:39.020163 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.020121 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-njmgk"]
Apr 17 16:31:39.020758 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.020736 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-454rf"]
Apr 17 16:31:39.084868 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.084781 2569 generic.go:358] "Generic (PLEG): container finished" podID="4415d384-34d0-412a-8d4e-c8f3077b28f5" containerID="ad6f1742593b35d41e5f3006706904e6cc5f567b048369e8411a0fbc90ad63a7" exitCode=0
Apr 17 16:31:39.085354 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.084869 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerDied","Data":"ad6f1742593b35d41e5f3006706904e6cc5f567b048369e8411a0fbc90ad63a7"}
Apr 17 16:31:39.088068 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.088050 2569 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log" Apr 17 16:31:39.088496 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.088427 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:39.088496 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.088453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" event={"ID":"56d56159-cb46-4e72-b5a0-a94c8cc6452d","Type":"ContainerStarted","Data":"825caa80518922e70405bec4f91a3570a742e12351ddd845459142249bd3d738"} Apr 17 16:31:39.088653 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.088494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:39.088653 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.088523 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:39.088653 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.088628 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:31:39.088769 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:39.088648 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:39.088769 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:39.088718 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:39.088863 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:39.088819 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b" Apr 17 16:31:39.137179 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:39.137130 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" podStartSLOduration=10.905026129 podStartE2EDuration="28.137115894s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.40625249 +0000 UTC m=+3.063059689" lastFinishedPulling="2026-04-17 16:31:30.638342264 +0000 UTC m=+20.295149454" observedRunningTime="2026-04-17 16:31:39.136949248 +0000 UTC m=+28.793756456" watchObservedRunningTime="2026-04-17 16:31:39.137115894 +0000 UTC m=+28.793923101" Apr 17 16:31:40.092568 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:40.092536 2569 generic.go:358] "Generic (PLEG): container finished" podID="4415d384-34d0-412a-8d4e-c8f3077b28f5" containerID="b007d7b227eb84137db6c12865d3d6be7b41ba0c0f48e4d73a186b6bc7b00361" exitCode=0 Apr 17 
16:31:40.093100 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:40.092614 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerDied","Data":"b007d7b227eb84137db6c12865d3d6be7b41ba0c0f48e4d73a186b6bc7b00361"} Apr 17 16:31:40.093100 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:40.092751 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 16:31:40.912722 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:40.912691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:40.912722 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:40.912711 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:40.912958 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:40.912702 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:40.912958 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:40.912819 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b" Apr 17 16:31:40.912958 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:40.912913 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:40.913115 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:40.912983 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:41.110716 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:41.110682 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:31:42.913405 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:42.913372 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:31:42.913966 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:42.913372 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7" Apr 17 16:31:42.913966 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:42.913489 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-454rf" podUID="cf2a6beb-fbfb-4062-b87b-a178033b242c" Apr 17 16:31:42.913966 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:42.913379 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:31:42.913966 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:42.913544 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5vcm7" podUID="63fffba4-9c72-4652-811d-ebe876a5f73b" Apr 17 16:31:42.913966 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:42.913639 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1" Apr 17 16:31:43.629521 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.629493 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-182.ec2.internal" event="NodeReady" Apr 17 16:31:43.629670 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.629634 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:31:43.679014 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.678710 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-786846df7d-8q4xm"] Apr 17 16:31:43.682860 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.682836 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.691589 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.691565 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:31:43.691770 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.691753 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:31:43.691871 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.691859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:31:43.691936 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.691909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nd5dg\"" Apr 17 16:31:43.695604 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.695584 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7bhrv"] Apr 17 16:31:43.698617 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.698600 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:31:43.700728 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.700697 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:31:43.701202 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.701182 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-786846df7d-8q4xm"] Apr 17 16:31:43.702124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.702049 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-txqpq\"" Apr 17 16:31:43.702124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.702100 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:31:43.702256 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.702243 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:31:43.702306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.702278 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:31:43.708695 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.708674 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7bhrv"] Apr 17 16:31:43.768479 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768425 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eab9531b-0bb9-4c09-812f-28f31cb4253e-ca-trust-extracted\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" 
Apr 17 16:31:43.768479 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-image-registry-private-configuration\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.768696 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768554 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-certificates\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.768696 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-trusted-ca\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.768696 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768602 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-bound-sa-token\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.768696 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768671 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgtfg\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-kube-api-access-cgtfg\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.768878 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768720 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-installation-pull-secrets\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.768878 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.768778 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.791755 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.791722 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-scnk5"] Apr 17 16:31:43.795080 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.795055 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-scnk5" Apr 17 16:31:43.799398 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.799373 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f96dz\"" Apr 17 16:31:43.799628 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.799370 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:31:43.799706 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.799693 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:31:43.808616 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.808580 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-scnk5"] Apr 17 16:31:43.869541 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgtfg\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-kube-api-access-cgtfg\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.869791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-installation-pull-secrets\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.869791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.869791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:31:43.869791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869678 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eab9531b-0bb9-4c09-812f-28f31cb4253e-ca-trust-extracted\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.869791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-image-registry-private-configuration\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.869791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-certificates\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 
16:31:43.869791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-trusted-ca\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.870190 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-bound-sa-token\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.870190 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.869847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9chth\" (UniqueName: \"kubernetes.io/projected/db105214-70a5-4d57-b705-6d896bd0f8a3-kube-api-access-9chth\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:31:43.870288 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:43.870211 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:43.870288 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:43.870231 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found Apr 17 16:31:43.870288 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.870249 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/eab9531b-0bb9-4c09-812f-28f31cb4253e-ca-trust-extracted\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.870425 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:43.870290 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.370269694 +0000 UTC m=+34.027076898 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found Apr 17 16:31:43.870717 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.870699 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-certificates\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.870846 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.870821 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-trusted-ca\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.875032 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.875013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-installation-pull-secrets\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.875144 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.875016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-image-registry-private-configuration\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.882538 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.882516 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-bound-sa-token\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.884504 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.884484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgtfg\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-kube-api-access-cgtfg\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:43.971010 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.970926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9chth\" (UniqueName: \"kubernetes.io/projected/db105214-70a5-4d57-b705-6d896bd0f8a3-kube-api-access-9chth\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 
Apr 17 16:31:43.971010 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.970987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-config-volume\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:43.971580 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.971033 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:43.971580 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.971121 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv"
Apr 17 16:31:43.971580 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.971153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgww9\" (UniqueName: \"kubernetes.io/projected/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-kube-api-access-kgww9\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:43.971580 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.971175 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-tmp-dir\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:43.971580 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:43.971275 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:43.971580 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:43.971349 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.471325985 +0000 UTC m=+34.128133186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found
Apr 17 16:31:43.981814 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:43.981771 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9chth\" (UniqueName: \"kubernetes.io/projected/db105214-70a5-4d57-b705-6d896bd0f8a3-kube-api-access-9chth\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv"
Apr 17 16:31:44.072079 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.072039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.072343 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.072115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgww9\" (UniqueName: \"kubernetes.io/projected/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-kube-api-access-kgww9\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.072343 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.072149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-tmp-dir\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.072343 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.072208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-config-volume\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.072498 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.072343 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:44.072498 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.072420 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:44.572396612 +0000 UTC m=+34.229203800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:44.072717 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.072690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-tmp-dir\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.072962 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.072936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-config-volume\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.082055 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.082030 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgww9\" (UniqueName: \"kubernetes.io/projected/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-kube-api-access-kgww9\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.374123 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.374083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm"
Apr 17 16:31:44.374307 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.374239 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:31:44.374307 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.374255 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found
Apr 17 16:31:44.374402 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.374311 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.374296394 +0000 UTC m=+35.031103584 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found
Apr 17 16:31:44.475397 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.475365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv"
Apr 17 16:31:44.475581 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.475535 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:44.475634 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.475612 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.475592969 +0000 UTC m=+35.132400180 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found
Apr 17 16:31:44.576217 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.576183 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:44.576411 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.576319 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:44.576411 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.576373 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.57635963 +0000 UTC m=+35.233166820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:44.677177 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.677081 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:44.677177 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.677145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:44.677403 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.677258 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:44.677403 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.677322 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:44.677403 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.677338 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:44.677403 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.677343 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:16.677323628 +0000 UTC m=+66.334130827 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:44.677403 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.677349 2569 projected.go:194] Error preparing data for projected volume kube-api-access-swls8 for pod openshift-network-diagnostics/network-check-target-454rf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:44.677403 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:44.677388 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8 podName:cf2a6beb-fbfb-4062-b87b-a178033b242c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:16.677374459 +0000 UTC m=+66.334181645 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-swls8" (UniqueName: "kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8") pod "network-check-target-454rf" (UID: "cf2a6beb-fbfb-4062-b87b-a178033b242c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:44.909819 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.909771 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:44.909987 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.909831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:31:44.909987 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.909771 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:31:44.914025 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.913817 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:31:44.914025 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.913828 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bq7nv\""
Apr 17 16:31:44.914025 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.913864 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:31:44.914025 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.913821 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:31:44.914025 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.913895 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z6rg9\""
Apr 17 16:31:44.914025 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:44.913828 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:31:45.381955 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:45.381923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm"
Apr 17 16:31:45.382383 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:45.382076 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:31:45.382383 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:45.382096 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found
Apr 17 16:31:45.382383 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:45.382157 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.382142798 +0000 UTC m=+37.038949988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found
Apr 17 16:31:45.482733 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:45.482695 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:45.482931 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:45.482751 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv"
Apr 17 16:31:45.483008 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:45.482930 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:45.483061 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:45.483007 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.482987074 +0000 UTC m=+37.139794275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found
Apr 17 16:31:45.485309 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:45.485289 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63fffba4-9c72-4652-811d-ebe876a5f73b-original-pull-secret\") pod \"global-pull-secret-syncer-5vcm7\" (UID: \"63fffba4-9c72-4652-811d-ebe876a5f73b\") " pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:45.520982 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:45.520937 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5vcm7"
Apr 17 16:31:45.583365 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:45.583331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:45.583522 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:45.583494 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:45.583582 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:45.583559 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:47.583542626 +0000 UTC m=+37.240349815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:45.841817 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:45.841597 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5vcm7"]
Apr 17 16:31:45.847214 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:45.847186 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63fffba4_9c72_4652_811d_ebe876a5f73b.slice/crio-10c605a82c83f4ffff521eaa88e08d6219c6a2817d41ccbfb70902ea834633bc WatchSource:0}: Error finding container 10c605a82c83f4ffff521eaa88e08d6219c6a2817d41ccbfb70902ea834633bc: Status 404 returned error can't find the container with id 10c605a82c83f4ffff521eaa88e08d6219c6a2817d41ccbfb70902ea834633bc
Apr 17 16:31:46.108340 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:46.108299 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerStarted","Data":"315cee9eaa72a2a05106d6a4bfcd05a1e833c82cc8f5d99d9b73b51f772507f7"}
Apr 17 16:31:46.109317 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:46.109295 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5vcm7" event={"ID":"63fffba4-9c72-4652-811d-ebe876a5f73b","Type":"ContainerStarted","Data":"10c605a82c83f4ffff521eaa88e08d6219c6a2817d41ccbfb70902ea834633bc"}
Apr 17 16:31:47.114477 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:47.114434 2569 generic.go:358] "Generic (PLEG): container finished" podID="4415d384-34d0-412a-8d4e-c8f3077b28f5" containerID="315cee9eaa72a2a05106d6a4bfcd05a1e833c82cc8f5d99d9b73b51f772507f7" exitCode=0
Apr 17 16:31:47.114956 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:47.114504 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerDied","Data":"315cee9eaa72a2a05106d6a4bfcd05a1e833c82cc8f5d99d9b73b51f772507f7"}
Apr 17 16:31:47.397563 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:47.397490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm"
Apr 17 16:31:47.397716 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:47.397627 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:31:47.397716 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:47.397643 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found
Apr 17 16:31:47.397716 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:47.397705 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:31:51.397687795 +0000 UTC m=+41.054495004 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found
Apr 17 16:31:47.497915 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:47.497869 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv"
Apr 17 16:31:47.498073 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:47.498018 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:47.498113 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:47.498076 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:51.498060897 +0000 UTC m=+41.154868086 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found
Apr 17 16:31:47.599013 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:47.598972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:47.599175 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:47.599109 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:47.599227 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:47.599183 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:51.599164931 +0000 UTC m=+41.255972138 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:48.119342 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:48.119307 2569 generic.go:358] "Generic (PLEG): container finished" podID="4415d384-34d0-412a-8d4e-c8f3077b28f5" containerID="047a06ae3f9c935f1e42c40a59d397dca984f9f679e876b0fbb29ec60b7ee671" exitCode=0
Apr 17 16:31:48.119774 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:48.119376 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerDied","Data":"047a06ae3f9c935f1e42c40a59d397dca984f9f679e876b0fbb29ec60b7ee671"}
Apr 17 16:31:49.124734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:49.124699 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" event={"ID":"4415d384-34d0-412a-8d4e-c8f3077b28f5","Type":"ContainerStarted","Data":"5e0334d06af4a720c1b5dac27cafc0bfb4d654fa4e19ff7a475694a19d077c83"}
Apr 17 16:31:49.147153 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:49.147096 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fvk4x" podStartSLOduration=5.681910547 podStartE2EDuration="38.147078665s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:31:13.409939886 +0000 UTC m=+3.066747072" lastFinishedPulling="2026-04-17 16:31:45.875107989 +0000 UTC m=+35.531915190" observedRunningTime="2026-04-17 16:31:49.146166891 +0000 UTC m=+38.802974135" watchObservedRunningTime="2026-04-17 16:31:49.147078665 +0000 UTC m=+38.803885875"
Apr 17 16:31:51.129724 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:51.129686 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5vcm7" event={"ID":"63fffba4-9c72-4652-811d-ebe876a5f73b","Type":"ContainerStarted","Data":"fdc12d328c324f78d38d648179833a1c6c88a309007a532529a829e9c3e0f0ac"}
Apr 17 16:31:51.143257 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:51.143207 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5vcm7" podStartSLOduration=33.265452219 podStartE2EDuration="38.143194446s" podCreationTimestamp="2026-04-17 16:31:13 +0000 UTC" firstStartedPulling="2026-04-17 16:31:45.85465734 +0000 UTC m=+35.511464526" lastFinishedPulling="2026-04-17 16:31:50.732399562 +0000 UTC m=+40.389206753" observedRunningTime="2026-04-17 16:31:51.142656433 +0000 UTC m=+40.799463643" watchObservedRunningTime="2026-04-17 16:31:51.143194446 +0000 UTC m=+40.800001655"
Apr 17 16:31:51.426448 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:51.426357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm"
Apr 17 16:31:51.426606 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:51.426482 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:31:51.426606 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:51.426494 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found
Apr 17 16:31:51.426606 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:51.426540 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:31:59.426526764 +0000 UTC m=+49.083333954 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found
Apr 17 16:31:51.526944 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:51.526912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv"
Apr 17 16:31:51.527131 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:51.527066 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:31:51.527131 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:51.527129 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:59.527114322 +0000 UTC m=+49.183921511 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found
Apr 17 16:31:51.627743 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:51.627709 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5"
Apr 17 16:31:51.627937 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:51.627905 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:31:51.628018 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:51.627984 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:59.627961101 +0000 UTC m=+49.284768288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found
Apr 17 16:31:56.936535 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:56.936502 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"]
Apr 17 16:31:56.940901 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:56.940886 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:56.943901 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:56.943872 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 16:31:56.943901 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:56.943896 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 16:31:56.944059 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:56.943897 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 16:31:56.944059 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:56.943872 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 16:31:56.949771 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:56.949747 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"]
Apr 17 16:31:57.065826 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.065765 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e05533e-a468-41b0-ad52-cd5144cff4a7-tmp\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.066005 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.065835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flr2j\" (UniqueName: \"kubernetes.io/projected/3e05533e-a468-41b0-ad52-cd5144cff4a7-kube-api-access-flr2j\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.066005 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.065920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3e05533e-a468-41b0-ad52-cd5144cff4a7-klusterlet-config\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.166913 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.166879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e05533e-a468-41b0-ad52-cd5144cff4a7-tmp\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.166913 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.166915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flr2j\" (UniqueName: \"kubernetes.io/projected/3e05533e-a468-41b0-ad52-cd5144cff4a7-kube-api-access-flr2j\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.167160 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.167093 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3e05533e-a468-41b0-ad52-cd5144cff4a7-klusterlet-config\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.167314 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.167296 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e05533e-a468-41b0-ad52-cd5144cff4a7-tmp\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.171213 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.171197 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3e05533e-a468-41b0-ad52-cd5144cff4a7-klusterlet-config\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.174773 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.174748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flr2j\" (UniqueName: \"kubernetes.io/projected/3e05533e-a468-41b0-ad52-cd5144cff4a7-kube-api-access-flr2j\") pod \"klusterlet-addon-workmgr-57986bb56c-vsjx7\" (UID: \"3e05533e-a468-41b0-ad52-cd5144cff4a7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"
Apr 17 16:31:57.250079 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.249985 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" Apr 17 16:31:57.385544 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:57.385517 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7"] Apr 17 16:31:57.388565 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:31:57.388540 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e05533e_a468_41b0_ad52_cd5144cff4a7.slice/crio-117894a7293db8f14d0c30c66cf6d85bf7cf487e31842b94d81718d869dd54f9 WatchSource:0}: Error finding container 117894a7293db8f14d0c30c66cf6d85bf7cf487e31842b94d81718d869dd54f9: Status 404 returned error can't find the container with id 117894a7293db8f14d0c30c66cf6d85bf7cf487e31842b94d81718d869dd54f9 Apr 17 16:31:58.143554 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:58.143516 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" event={"ID":"3e05533e-a468-41b0-ad52-cd5144cff4a7","Type":"ContainerStarted","Data":"117894a7293db8f14d0c30c66cf6d85bf7cf487e31842b94d81718d869dd54f9"} Apr 17 16:31:59.483588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:59.483554 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:31:59.483964 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:59.483713 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:59.483964 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:59.483733 2569 projected.go:194] 
Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found Apr 17 16:31:59.483964 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:59.483787 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:32:15.483771626 +0000 UTC m=+65.140578815 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found Apr 17 16:31:59.584812 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:59.584778 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:31:59.584991 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:59.584928 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:59.585035 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:59.584990 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:15.584974927 +0000 UTC m=+65.241782128 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found Apr 17 16:31:59.686139 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:31:59.686102 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5" Apr 17 16:31:59.686305 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:59.686255 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:59.686343 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:31:59.686312 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:15.686296371 +0000 UTC m=+65.343103561 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found Apr 17 16:32:07.729849 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:07.729778 2569 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253: reading manifest sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253 in registry.redhat.io/multicluster-engine/multicloud-manager-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" image="registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253" Apr 17 16:32:07.730253 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:07.730024 2569 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:acm-agent,Image:registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253,Command:[],Args:[/agent --port=4443 --agent-port=443 --hub-kubeconfig=/var/run/klusterlet/kubeconfig --cluster-name=6496516f-6bf9-4774-907d-2d2b9960e1b4 --agent-name=klusterlet-addon-workmgr],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{memory: {{134217728 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tmp,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:klusterlet-config,ReadOnly:false,MountPath:/var/run/klusterlet,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flr2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000590000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod klusterlet-addon-workmgr-57986bb56c-vsjx7_open-cluster-management-agent-addon(3e05533e-a468-41b0-ad52-cd5144cff4a7): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source 
docker://registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253: reading manifest sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253 in registry.redhat.io/multicluster-engine/multicloud-manager-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 16:32:07.731218 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:07.731187 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253: reading manifest sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253 in registry.redhat.io/multicluster-engine/multicloud-manager-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" podUID="3e05533e-a468-41b0-ad52-cd5144cff4a7" Apr 17 16:32:08.162648 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:08.162615 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253: reading manifest sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253 in 
registry.redhat.io/multicluster-engine/multicloud-manager-rhel9: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" podUID="3e05533e-a468-41b0-ad52-cd5144cff4a7" Apr 17 16:32:11.153205 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:11.153173 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2cjj2" Apr 17 16:32:15.493216 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:15.493176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:32:15.493697 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:15.493344 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:32:15.493697 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:15.493369 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found Apr 17 16:32:15.493697 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:15.493442 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:32:47.493421229 +0000 UTC m=+97.150228421 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found Apr 17 16:32:15.594413 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:15.594368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:32:15.594666 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:15.594515 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:15.594666 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:15.594584 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:47.594569839 +0000 UTC m=+97.251377028 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found Apr 17 16:32:15.695448 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:15.695415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5" Apr 17 16:32:15.695599 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:15.695577 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:15.695682 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:15.695670 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:47.695648753 +0000 UTC m=+97.352455956 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found Apr 17 16:32:16.702065 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.702007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:32:16.702065 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.702068 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:32:16.705080 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.705060 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:32:16.705155 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.705064 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:32:16.712614 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:16.712595 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:32:16.712665 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:16.712647 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 
nodeName:}" failed. No retries permitted until 2026-04-17 16:33:20.712632881 +0000 UTC m=+130.369440071 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : secret "metrics-daemon-secret" not found Apr 17 16:32:16.714997 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.714976 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:32:16.725044 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.725022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swls8\" (UniqueName: \"kubernetes.io/projected/cf2a6beb-fbfb-4062-b87b-a178033b242c-kube-api-access-swls8\") pod \"network-check-target-454rf\" (UID: \"cf2a6beb-fbfb-4062-b87b-a178033b242c\") " pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:32:16.731404 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.731384 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z6rg9\"" Apr 17 16:32:16.739454 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.739436 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:32:16.849959 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:16.849930 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-454rf"] Apr 17 16:32:16.853275 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:32:16.853240 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2a6beb_fbfb_4062_b87b_a178033b242c.slice/crio-b9d880956fae00665cf4bf6d119b7aeb61e66dafd5470c535e3e5e5671492d53 WatchSource:0}: Error finding container b9d880956fae00665cf4bf6d119b7aeb61e66dafd5470c535e3e5e5671492d53: Status 404 returned error can't find the container with id b9d880956fae00665cf4bf6d119b7aeb61e66dafd5470c535e3e5e5671492d53 Apr 17 16:32:17.178407 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:17.178371 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-454rf" event={"ID":"cf2a6beb-fbfb-4062-b87b-a178033b242c","Type":"ContainerStarted","Data":"b9d880956fae00665cf4bf6d119b7aeb61e66dafd5470c535e3e5e5671492d53"} Apr 17 16:32:20.186214 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:20.186185 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-454rf" event={"ID":"cf2a6beb-fbfb-4062-b87b-a178033b242c","Type":"ContainerStarted","Data":"ed193bdf897f6be199c3f529ce9e67695c7da47d7336752b7a6b5a81d1f51c5f"} Apr 17 16:32:20.186583 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:20.186322 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-454rf" Apr 17 16:32:20.201153 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:20.201108 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-454rf" 
podStartSLOduration=66.491862239 podStartE2EDuration="1m9.201096414s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:32:16.855068635 +0000 UTC m=+66.511875822" lastFinishedPulling="2026-04-17 16:32:19.564302812 +0000 UTC m=+69.221109997" observedRunningTime="2026-04-17 16:32:20.20010215 +0000 UTC m=+69.856909358" watchObservedRunningTime="2026-04-17 16:32:20.201096414 +0000 UTC m=+69.857903622" Apr 17 16:32:26.199993 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:26.199959 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" event={"ID":"3e05533e-a468-41b0-ad52-cd5144cff4a7","Type":"ContainerStarted","Data":"97adb0c08ede20a2e6a02df32129c3def13f66a1c1d6295e886a8b7b99c141ae"} Apr 17 16:32:26.200505 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:26.200289 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" Apr 17 16:32:26.201852 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:26.201829 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" Apr 17 16:32:26.217009 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:26.216959 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57986bb56c-vsjx7" podStartSLOduration=2.158539081 podStartE2EDuration="30.21694592s" podCreationTimestamp="2026-04-17 16:31:56 +0000 UTC" firstStartedPulling="2026-04-17 16:31:57.390275208 +0000 UTC m=+47.047082394" lastFinishedPulling="2026-04-17 16:32:25.448682047 +0000 UTC m=+75.105489233" observedRunningTime="2026-04-17 16:32:26.21641895 +0000 UTC m=+75.873226158" watchObservedRunningTime="2026-04-17 16:32:26.21694592 +0000 UTC m=+75.873753141" Apr 17 16:32:47.528592 
ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:47.528536 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:32:47.529032 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:47.528683 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:32:47.529032 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:47.528702 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found Apr 17 16:32:47.529032 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:47.528753 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:33:51.528738612 +0000 UTC m=+161.185545800 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found Apr 17 16:32:47.629900 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:47.629866 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:32:47.630046 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:47.629993 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:47.630046 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:47.630041 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:51.630027921 +0000 UTC m=+161.286835107 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found Apr 17 16:32:47.731033 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:47.730990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5" Apr 17 16:32:47.731202 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:47.731108 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:47.731202 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:32:47.731157 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:51.731143894 +0000 UTC m=+161.387951084 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found
Apr 17 16:32:51.191666 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:32:51.191633 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-454rf"
Apr 17 16:33:20.765786 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:20.765742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk"
Apr 17 16:33:20.766278 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:20.765903 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 16:33:20.766278 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:20.765983 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs podName:79adaa92-9fae-4abb-b9ae-335440dbe8f1 nodeName:}" failed. No retries permitted until 2026-04-17 16:35:22.765966797 +0000 UTC m=+252.422773983 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs") pod "network-metrics-daemon-njmgk" (UID: "79adaa92-9fae-4abb-b9ae-335440dbe8f1") : secret "metrics-daemon-secret" not found
Apr 17 16:33:39.167309 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.167273 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"]
Apr 17 16:33:39.170460 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.170428 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:39.172612 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.172586 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:33:39.172734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.172614 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9htzt\""
Apr 17 16:33:39.173423 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.173383 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 16:33:39.173423 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.173389 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 16:33:39.180636 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.180613 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"]
Apr 17 16:33:39.291653 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.291615 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:39.291851 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.291663 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mgc7\" (UniqueName: \"kubernetes.io/projected/3b652fff-5e0b-465f-873f-276ed25f7476-kube-api-access-4mgc7\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:39.371578 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.371543 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"]
Apr 17 16:33:39.374442 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.374418 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"
Apr 17 16:33:39.374915 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.374887 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"]
Apr 17 16:33:39.377432 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.377414 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-6pv2p\""
Apr 17 16:33:39.377536 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.377518 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"]
Apr 17 16:33:39.377663 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.377648 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.379759 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.379734 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 16:33:39.379937 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.379922 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:33:39.380402 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.380383 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:39.380468 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.380408 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 16:33:39.380468 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.380458 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2gqh5\""
Apr 17 16:33:39.380933 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.380916 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 16:33:39.382402 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.382384 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2z75j\""
Apr 17 16:33:39.382835 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.382820 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 16:33:39.382835 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.382830 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 16:33:39.387497 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.387466 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"]
Apr 17 16:33:39.388577 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.388558 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"]
Apr 17 16:33:39.392715 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.392680 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:39.392855 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.392736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mgc7\" (UniqueName: \"kubernetes.io/projected/3b652fff-5e0b-465f-873f-276ed25f7476-kube-api-access-4mgc7\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:39.393054 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:39.393031 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:33:39.393147 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:39.393124 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls podName:3b652fff-5e0b-465f-873f-276ed25f7476 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:39.893104423 +0000 UTC m=+149.549911670 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wp7s7" (UID: "3b652fff-5e0b-465f-873f-276ed25f7476") : secret "samples-operator-tls" not found
Apr 17 16:33:39.403300 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.403271 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"]
Apr 17 16:33:39.405939 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.405917 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mgc7\" (UniqueName: \"kubernetes.io/projected/3b652fff-5e0b-465f-873f-276ed25f7476-kube-api-access-4mgc7\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:39.494045 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.493959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:39.494045 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.493993 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxpkb\" (UniqueName: \"kubernetes.io/projected/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-kube-api-access-mxpkb\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.494045 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.494027 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9v5\" (UniqueName: \"kubernetes.io/projected/b47a469b-6847-4d3a-b7a8-64dad8f5db47-kube-api-access-nm9v5\") pod \"network-check-source-8894fc9bd-2nzr7\" (UID: \"b47a469b-6847-4d3a-b7a8-64dad8f5db47\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"
Apr 17 16:33:39.494250 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.494084 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8dec519d-f277-4d63-80ab-1edcb2ec1275-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:39.494250 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.494130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.494250 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.494192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.594505 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.594471 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:39.594594 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.594508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxpkb\" (UniqueName: \"kubernetes.io/projected/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-kube-api-access-mxpkb\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.594634 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:39.594607 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:39.594664 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.594631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9v5\" (UniqueName: \"kubernetes.io/projected/b47a469b-6847-4d3a-b7a8-64dad8f5db47-kube-api-access-nm9v5\") pod \"network-check-source-8894fc9bd-2nzr7\" (UID: \"b47a469b-6847-4d3a-b7a8-64dad8f5db47\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"
Apr 17 16:33:39.594697 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:39.594674 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert podName:8dec519d-f277-4d63-80ab-1edcb2ec1275 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:40.094659222 +0000 UTC m=+149.751466408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6svzp" (UID: "8dec519d-f277-4d63-80ab-1edcb2ec1275") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:39.594753 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.594705 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8dec519d-f277-4d63-80ab-1edcb2ec1275-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:39.594791 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.594762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.594862 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.594825 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.595320 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.595295 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.595430 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.595393 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8dec519d-f277-4d63-80ab-1edcb2ec1275-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:39.596910 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.596895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.605065 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.605040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9v5\" (UniqueName: \"kubernetes.io/projected/b47a469b-6847-4d3a-b7a8-64dad8f5db47-kube-api-access-nm9v5\") pod \"network-check-source-8894fc9bd-2nzr7\" (UID: \"b47a469b-6847-4d3a-b7a8-64dad8f5db47\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"
Apr 17 16:33:39.605161 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.605072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxpkb\" (UniqueName: \"kubernetes.io/projected/f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068-kube-api-access-mxpkb\") pod \"kube-storage-version-migrator-operator-6769c5d45-zlzch\" (UID: \"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.684673 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.684635 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"
Apr 17 16:33:39.690878 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.690853 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"
Apr 17 16:33:39.821343 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.821313 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7"]
Apr 17 16:33:39.824004 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:33:39.823978 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb47a469b_6847_4d3a_b7a8_64dad8f5db47.slice/crio-868a5c7b1397dc26f53a61308f580d3fe5b9157936774bc8392df007a44f8d1d WatchSource:0}: Error finding container 868a5c7b1397dc26f53a61308f580d3fe5b9157936774bc8392df007a44f8d1d: Status 404 returned error can't find the container with id 868a5c7b1397dc26f53a61308f580d3fe5b9157936774bc8392df007a44f8d1d
Apr 17 16:33:39.831067 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.831046 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch"]
Apr 17 16:33:39.834620 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:33:39.834593 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4bc3e8d_bd1c_4102_aab9_bacaaeb0e068.slice/crio-9cb111d4ece4ff175a1de40ce495da8a31410e4658db6d6b89e4aac3e593d683 WatchSource:0}: Error finding container 9cb111d4ece4ff175a1de40ce495da8a31410e4658db6d6b89e4aac3e593d683: Status 404 returned error can't find the container with id 9cb111d4ece4ff175a1de40ce495da8a31410e4658db6d6b89e4aac3e593d683
Apr 17 16:33:39.897280 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:39.897253 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:39.897416 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:39.897394 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:33:39.897479 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:39.897448 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls podName:3b652fff-5e0b-465f-873f-276ed25f7476 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:40.897429902 +0000 UTC m=+150.554237097 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wp7s7" (UID: "3b652fff-5e0b-465f-873f-276ed25f7476") : secret "samples-operator-tls" not found
Apr 17 16:33:40.099117 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:40.099075 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:40.099291 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:40.099231 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:40.099332 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:40.099300 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert podName:8dec519d-f277-4d63-80ab-1edcb2ec1275 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:41.099282329 +0000 UTC m=+150.756089530 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6svzp" (UID: "8dec519d-f277-4d63-80ab-1edcb2ec1275") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:40.337948 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:40.337882 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch" event={"ID":"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068","Type":"ContainerStarted","Data":"9cb111d4ece4ff175a1de40ce495da8a31410e4658db6d6b89e4aac3e593d683"}
Apr 17 16:33:40.339337 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:40.339305 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7" event={"ID":"b47a469b-6847-4d3a-b7a8-64dad8f5db47","Type":"ContainerStarted","Data":"2e7074afc83dd3e7fc298736a4a5b1df63a537459d17f8b94f76e4f34f0ea29f"}
Apr 17 16:33:40.339458 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:40.339339 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7" event={"ID":"b47a469b-6847-4d3a-b7a8-64dad8f5db47","Type":"ContainerStarted","Data":"868a5c7b1397dc26f53a61308f580d3fe5b9157936774bc8392df007a44f8d1d"}
Apr 17 16:33:40.355208 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:40.355118 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2nzr7" podStartSLOduration=1.35510561 podStartE2EDuration="1.35510561s" podCreationTimestamp="2026-04-17 16:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:40.354225198 +0000 UTC m=+150.011032419" watchObservedRunningTime="2026-04-17 16:33:40.35510561 +0000 UTC m=+150.011912817"
Apr 17 16:33:40.906324 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:40.906292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:40.906489 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:40.906420 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:33:40.906489 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:40.906473 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls podName:3b652fff-5e0b-465f-873f-276ed25f7476 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:42.906458527 +0000 UTC m=+152.563265717 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wp7s7" (UID: "3b652fff-5e0b-465f-873f-276ed25f7476") : secret "samples-operator-tls" not found
Apr 17 16:33:41.107754 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:41.107692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:41.107966 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:41.107881 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:41.107966 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:41.107949 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert podName:8dec519d-f277-4d63-80ab-1edcb2ec1275 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:43.107933347 +0000 UTC m=+152.764740538 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6svzp" (UID: "8dec519d-f277-4d63-80ab-1edcb2ec1275") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:42.345553 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:42.345518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch" event={"ID":"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068","Type":"ContainerStarted","Data":"cc46971ddca344b7a29b3c6385cf177b09d51742d9745dbe033508de85b8b172"}
Apr 17 16:33:42.364617 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:42.364567 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch" podStartSLOduration=1.435336766 podStartE2EDuration="3.364552612s" podCreationTimestamp="2026-04-17 16:33:39 +0000 UTC" firstStartedPulling="2026-04-17 16:33:39.83618978 +0000 UTC m=+149.492996966" lastFinishedPulling="2026-04-17 16:33:41.765405609 +0000 UTC m=+151.422212812" observedRunningTime="2026-04-17 16:33:42.363313874 +0000 UTC m=+152.020121081" watchObservedRunningTime="2026-04-17 16:33:42.364552612 +0000 UTC m=+152.021359823"
Apr 17 16:33:42.920334 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:42.920300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:42.920536 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:42.920443 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:33:42.920536 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:42.920510 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls podName:3b652fff-5e0b-465f-873f-276ed25f7476 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:46.920491675 +0000 UTC m=+156.577298863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wp7s7" (UID: "3b652fff-5e0b-465f-873f-276ed25f7476") : secret "samples-operator-tls" not found
Apr 17 16:33:43.122170 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:43.122122 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:43.122363 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:43.122228 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:43.122363 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:43.122326 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert podName:8dec519d-f277-4d63-80ab-1edcb2ec1275 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:47.122286506 +0000 UTC m=+156.779093695 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6svzp" (UID: "8dec519d-f277-4d63-80ab-1edcb2ec1275") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:46.693840 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:46.693760 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-786846df7d-8q4xm" podUID="eab9531b-0bb9-4c09-812f-28f31cb4253e"
Apr 17 16:33:46.709901 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:46.709860 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7bhrv" podUID="db105214-70a5-4d57-b705-6d896bd0f8a3"
Apr 17 16:33:46.807423 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:46.807388 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-scnk5" podUID="97268ee4-4d26-4550-8a7e-78ee5bbc45c0"
Apr 17 16:33:46.951154 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:46.951076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"
Apr 17 16:33:46.951290 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:46.951231 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 16:33:46.951328 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:46.951323 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls podName:3b652fff-5e0b-465f-873f-276ed25f7476 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:54.951304092 +0000 UTC m=+164.608111288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wp7s7" (UID: "3b652fff-5e0b-465f-873f-276ed25f7476") : secret "samples-operator-tls" not found
Apr 17 16:33:46.991066 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:46.991040 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qskd2_c855efcb-8d09-4e04-8063-d5bb5ae67dc3/dns-node-resolver/0.log"
Apr 17 16:33:47.152608 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:47.152571 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"
Apr 17 16:33:47.152746 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:47.152720 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 16:33:47.152821 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:47.152789 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert podName:8dec519d-f277-4d63-80ab-1edcb2ec1275 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:55.152771644 +0000 UTC m=+164.809578835 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6svzp" (UID: "8dec519d-f277-4d63-80ab-1edcb2ec1275") : secret "networking-console-plugin-cert" not found
Apr 17 16:33:47.358715 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:47.358687 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7bhrv"
Apr 17 16:33:47.358904 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:47.358686 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-786846df7d-8q4xm"
Apr 17 16:33:47.358904 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:47.358686 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-scnk5"
Apr 17 16:33:47.792253 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:47.792225 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sfzc4_5c3742c8-dbba-43ea-99fc-4321ab7b1156/node-ca/0.log"
Apr 17 16:33:47.934697 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:47.934658 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-njmgk" podUID="79adaa92-9fae-4abb-b9ae-335440dbe8f1"
Apr 17 16:33:49.393189 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:49.393160 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-zlzch_f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068/kube-storage-version-migrator-operator/0.log"
Apr 17 16:33:51.586005 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:51.585952 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") pod \"image-registry-786846df7d-8q4xm\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " pod="openshift-image-registry/image-registry-786846df7d-8q4xm"
Apr 17 16:33:51.586467 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:51.586090 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:33:51.586467 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:51.586110 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-786846df7d-8q4xm: secret "image-registry-tls" not found
Apr 17 16:33:51.586467 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:51.586175 2569
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls podName:eab9531b-0bb9-4c09-812f-28f31cb4253e nodeName:}" failed. No retries permitted until 2026-04-17 16:35:53.586159701 +0000 UTC m=+283.242966887 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls") pod "image-registry-786846df7d-8q4xm" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e") : secret "image-registry-tls" not found Apr 17 16:33:51.687240 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:51.687209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:33:51.687397 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:51.687377 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:33:51.687475 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:51.687466 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert podName:db105214-70a5-4d57-b705-6d896bd0f8a3 nodeName:}" failed. No retries permitted until 2026-04-17 16:35:53.68744642 +0000 UTC m=+283.344253606 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert") pod "ingress-canary-7bhrv" (UID: "db105214-70a5-4d57-b705-6d896bd0f8a3") : secret "canary-serving-cert" not found Apr 17 16:33:51.787862 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:51.787823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5" Apr 17 16:33:51.788076 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:51.787971 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:33:51.788076 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:51.788031 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls podName:97268ee4-4d26-4550-8a7e-78ee5bbc45c0 nodeName:}" failed. No retries permitted until 2026-04-17 16:35:53.788017324 +0000 UTC m=+283.444824512 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls") pod "dns-default-scnk5" (UID: "97268ee4-4d26-4550-8a7e-78ee5bbc45c0") : secret "dns-default-metrics-tls" not found Apr 17 16:33:55.011345 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:55.011292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7" Apr 17 16:33:55.013624 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:55.013596 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b652fff-5e0b-465f-873f-276ed25f7476-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wp7s7\" (UID: \"3b652fff-5e0b-465f-873f-276ed25f7476\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7" Apr 17 16:33:55.078962 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:55.078934 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7" Apr 17 16:33:55.199319 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:55.199287 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7"] Apr 17 16:33:55.213064 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:55.213038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp" Apr 17 16:33:55.213195 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:55.213180 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 16:33:55.213253 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:33:55.213243 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert podName:8dec519d-f277-4d63-80ab-1edcb2ec1275 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:11.213229267 +0000 UTC m=+180.870036457 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6svzp" (UID: "8dec519d-f277-4d63-80ab-1edcb2ec1275") : secret "networking-console-plugin-cert" not found Apr 17 16:33:55.378812 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:55.378704 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7" event={"ID":"3b652fff-5e0b-465f-873f-276ed25f7476","Type":"ContainerStarted","Data":"6057ba5ff0063a4eb0309097ec330e80343014c9dadf5e1865aca5d2b3c87812"} Apr 17 16:33:57.385714 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:57.385675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7" event={"ID":"3b652fff-5e0b-465f-873f-276ed25f7476","Type":"ContainerStarted","Data":"bab3ab35391052879d7276acbb9fbefec13085b787594acc01317ee83362d182"} Apr 17 16:33:57.386116 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:57.385720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7" event={"ID":"3b652fff-5e0b-465f-873f-276ed25f7476","Type":"ContainerStarted","Data":"f57c4367530111231307d82453c5a3f0a08fdc0193be13056d72844a5504a362"} Apr 17 16:33:57.402756 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:57.402712 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wp7s7" podStartSLOduration=16.79510724 podStartE2EDuration="18.402696508s" podCreationTimestamp="2026-04-17 16:33:39 +0000 UTC" firstStartedPulling="2026-04-17 16:33:55.24211482 +0000 UTC m=+164.898922021" lastFinishedPulling="2026-04-17 16:33:56.8497041 +0000 UTC m=+166.506511289" observedRunningTime="2026-04-17 
16:33:57.400998429 +0000 UTC m=+167.057805637" watchObservedRunningTime="2026-04-17 16:33:57.402696508 +0000 UTC m=+167.059503716" Apr 17 16:33:58.909535 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:33:58.909492 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:34:09.243197 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.243160 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-wn6rg"] Apr 17 16:34:09.246335 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.246313 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wn6rg" Apr 17 16:34:09.249469 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.249447 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 16:34:09.249580 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.249446 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 16:34:09.249580 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.249526 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-8h42v\"" Apr 17 16:34:09.266439 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.266407 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wn6rg"] Apr 17 16:34:09.267220 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.267202 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bnmth"] Apr 17 16:34:09.270174 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.270158 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.272886 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.272863 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:34:09.272998 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.272884 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:34:09.273152 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.273137 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-m9k6s\"" Apr 17 16:34:09.273215 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.273136 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:34:09.273636 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.273621 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:34:09.283259 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.283237 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bnmth"] Apr 17 16:34:09.324877 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.324843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85668985-17d4-4c12-88cd-1204c9fd0791-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.324877 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.324881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mhrgv\" (UniqueName: \"kubernetes.io/projected/85668985-17d4-4c12-88cd-1204c9fd0791-kube-api-access-mhrgv\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.325094 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.324913 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85668985-17d4-4c12-88cd-1204c9fd0791-data-volume\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.325094 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.325033 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85668985-17d4-4c12-88cd-1204c9fd0791-crio-socket\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.325094 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.325064 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5482\" (UniqueName: \"kubernetes.io/projected/f8e80855-0ca0-4413-990c-7bfef055921c-kube-api-access-k5482\") pod \"downloads-6bcc868b7-wn6rg\" (UID: \"f8e80855-0ca0-4413-990c-7bfef055921c\") " pod="openshift-console/downloads-6bcc868b7-wn6rg" Apr 17 16:34:09.325094 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.325087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85668985-17d4-4c12-88cd-1204c9fd0791-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bnmth\" (UID: 
\"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426147 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85668985-17d4-4c12-88cd-1204c9fd0791-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426326 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85668985-17d4-4c12-88cd-1204c9fd0791-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426326 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhrgv\" (UniqueName: \"kubernetes.io/projected/85668985-17d4-4c12-88cd-1204c9fd0791-kube-api-access-mhrgv\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426326 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85668985-17d4-4c12-88cd-1204c9fd0791-data-volume\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426474 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426369 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85668985-17d4-4c12-88cd-1204c9fd0791-crio-socket\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426474 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5482\" (UniqueName: \"kubernetes.io/projected/f8e80855-0ca0-4413-990c-7bfef055921c-kube-api-access-k5482\") pod \"downloads-6bcc868b7-wn6rg\" (UID: \"f8e80855-0ca0-4413-990c-7bfef055921c\") " pod="openshift-console/downloads-6bcc868b7-wn6rg" Apr 17 16:34:09.426576 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426511 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85668985-17d4-4c12-88cd-1204c9fd0791-crio-socket\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426725 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85668985-17d4-4c12-88cd-1204c9fd0791-data-volume\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.426853 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.426783 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85668985-17d4-4c12-88cd-1204c9fd0791-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.428604 ip-10-0-136-182 kubenswrapper[2569]: 
I0417 16:34:09.428576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85668985-17d4-4c12-88cd-1204c9fd0791-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.436648 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.436618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrgv\" (UniqueName: \"kubernetes.io/projected/85668985-17d4-4c12-88cd-1204c9fd0791-kube-api-access-mhrgv\") pod \"insights-runtime-extractor-bnmth\" (UID: \"85668985-17d4-4c12-88cd-1204c9fd0791\") " pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.437029 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.437014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5482\" (UniqueName: \"kubernetes.io/projected/f8e80855-0ca0-4413-990c-7bfef055921c-kube-api-access-k5482\") pod \"downloads-6bcc868b7-wn6rg\" (UID: \"f8e80855-0ca0-4413-990c-7bfef055921c\") " pod="openshift-console/downloads-6bcc868b7-wn6rg" Apr 17 16:34:09.554554 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.554520 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wn6rg" Apr 17 16:34:09.579627 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.579594 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bnmth" Apr 17 16:34:09.676787 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.676756 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wn6rg"] Apr 17 16:34:09.683242 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:09.683196 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e80855_0ca0_4413_990c_7bfef055921c.slice/crio-38f65266107b684fbd1a4807376eb44ffc7fc4c515422835994e86f4f1abfe09 WatchSource:0}: Error finding container 38f65266107b684fbd1a4807376eb44ffc7fc4c515422835994e86f4f1abfe09: Status 404 returned error can't find the container with id 38f65266107b684fbd1a4807376eb44ffc7fc4c515422835994e86f4f1abfe09 Apr 17 16:34:09.716696 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:09.716665 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bnmth"] Apr 17 16:34:09.719458 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:09.719430 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85668985_17d4_4c12_88cd_1204c9fd0791.slice/crio-6d2e285a1e313bc587c43c1a89394245057559f02aa8535265cb4ebce0ecf32f WatchSource:0}: Error finding container 6d2e285a1e313bc587c43c1a89394245057559f02aa8535265cb4ebce0ecf32f: Status 404 returned error can't find the container with id 6d2e285a1e313bc587c43c1a89394245057559f02aa8535265cb4ebce0ecf32f Apr 17 16:34:10.419305 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:10.419259 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bnmth" event={"ID":"85668985-17d4-4c12-88cd-1204c9fd0791","Type":"ContainerStarted","Data":"449bd6fbbb3242a2afa20ae8bdf9dfa073144e6a2741ba709a269ee60cec23fc"} Apr 17 16:34:10.419305 ip-10-0-136-182 kubenswrapper[2569]: I0417 
16:34:10.419303 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bnmth" event={"ID":"85668985-17d4-4c12-88cd-1204c9fd0791","Type":"ContainerStarted","Data":"34ab79d5afc532d3979126f43e398bc3b2dd1ba0377a59aac950a7645ec0596b"} Apr 17 16:34:10.419775 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:10.419318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bnmth" event={"ID":"85668985-17d4-4c12-88cd-1204c9fd0791","Type":"ContainerStarted","Data":"6d2e285a1e313bc587c43c1a89394245057559f02aa8535265cb4ebce0ecf32f"} Apr 17 16:34:10.420352 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:10.420324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wn6rg" event={"ID":"f8e80855-0ca0-4413-990c-7bfef055921c","Type":"ContainerStarted","Data":"38f65266107b684fbd1a4807376eb44ffc7fc4c515422835994e86f4f1abfe09"} Apr 17 16:34:11.241061 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:11.241020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp" Apr 17 16:34:11.244147 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:11.244121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8dec519d-f277-4d63-80ab-1edcb2ec1275-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6svzp\" (UID: \"8dec519d-f277-4d63-80ab-1edcb2ec1275\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp" Apr 17 16:34:11.499345 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:11.499263 
2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2z75j\"" Apr 17 16:34:11.507495 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:11.507406 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp" Apr 17 16:34:12.018480 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:12.018449 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6svzp"] Apr 17 16:34:12.021723 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:12.021690 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dec519d_f277_4d63_80ab_1edcb2ec1275.slice/crio-72ef2f7c8066cb21207e76b4ae8114abedf57d1c6fb45afa277c3be58529cfcf WatchSource:0}: Error finding container 72ef2f7c8066cb21207e76b4ae8114abedf57d1c6fb45afa277c3be58529cfcf: Status 404 returned error can't find the container with id 72ef2f7c8066cb21207e76b4ae8114abedf57d1c6fb45afa277c3be58529cfcf Apr 17 16:34:12.427892 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:12.427857 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bnmth" event={"ID":"85668985-17d4-4c12-88cd-1204c9fd0791","Type":"ContainerStarted","Data":"f0ecd747bd51e8f5e2d670de3e09a78726660cb0ac2cca8de12758c03ddffd85"} Apr 17 16:34:12.428999 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:12.428974 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp" event={"ID":"8dec519d-f277-4d63-80ab-1edcb2ec1275","Type":"ContainerStarted","Data":"72ef2f7c8066cb21207e76b4ae8114abedf57d1c6fb45afa277c3be58529cfcf"} Apr 17 16:34:12.447542 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:12.447480 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-insights/insights-runtime-extractor-bnmth" podStartSLOduration=1.2919064310000001 podStartE2EDuration="3.447464429s" podCreationTimestamp="2026-04-17 16:34:09 +0000 UTC" firstStartedPulling="2026-04-17 16:34:09.776660219 +0000 UTC m=+179.433467404" lastFinishedPulling="2026-04-17 16:34:11.932218211 +0000 UTC m=+181.589025402" observedRunningTime="2026-04-17 16:34:12.446827957 +0000 UTC m=+182.103635169" watchObservedRunningTime="2026-04-17 16:34:12.447464429 +0000 UTC m=+182.104271837" Apr 17 16:34:13.160703 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.160669 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79b4b9b57f-5rxbd"] Apr 17 16:34:13.164270 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.164244 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79b4b9b57f-5rxbd" Apr 17 16:34:13.169318 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.169270 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:34:13.169461 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.169361 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-v5jmp\"" Apr 17 16:34:13.169461 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.169437 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:34:13.169623 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.169487 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:34:13.169623 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.169579 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:34:13.170058 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:34:13.170039 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 16:34:13.177100 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.177081 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79b4b9b57f-5rxbd"]
Apr 17 16:34:13.259064 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.258976 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-oauth-config\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.259241 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.259103 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-oauth-serving-cert\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.259241 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.259147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-config\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.259241 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.259201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-serving-cert\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.259384 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.259263 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-service-ca\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.259384 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.259332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv7pn\" (UniqueName: \"kubernetes.io/projected/11ad1a17-2e5f-492c-be31-afef2b0e6a94-kube-api-access-kv7pn\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.360623 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.360582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-service-ca\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.360833 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.360672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv7pn\" (UniqueName: \"kubernetes.io/projected/11ad1a17-2e5f-492c-be31-afef2b0e6a94-kube-api-access-kv7pn\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.360833 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.360706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-oauth-config\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.360833 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.360757 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-oauth-serving-cert\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.360833 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.360785 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-config\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.361058 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.360851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-serving-cert\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.361369 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.361341 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-service-ca\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.362034 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.362006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-config\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.362034 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.362034 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-oauth-serving-cert\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.363390 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.363367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-oauth-config\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.363496 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.363476 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-serving-cert\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.370000 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.369979 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv7pn\" (UniqueName: \"kubernetes.io/projected/11ad1a17-2e5f-492c-be31-afef2b0e6a94-kube-api-access-kv7pn\") pod \"console-79b4b9b57f-5rxbd\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.432957 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.432905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp" event={"ID":"8dec519d-f277-4d63-80ab-1edcb2ec1275","Type":"ContainerStarted","Data":"2d3b765246a6472cfae1b770d98a2490e9d7ebe8056e45d58f58d000cfd3a389"}
Apr 17 16:34:13.450263 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.450211 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6svzp" podStartSLOduration=33.53085934 podStartE2EDuration="34.450194695s" podCreationTimestamp="2026-04-17 16:33:39 +0000 UTC" firstStartedPulling="2026-04-17 16:34:12.025669743 +0000 UTC m=+181.682476928" lastFinishedPulling="2026-04-17 16:34:12.945005096 +0000 UTC m=+182.601812283" observedRunningTime="2026-04-17 16:34:13.449005265 +0000 UTC m=+183.105812475" watchObservedRunningTime="2026-04-17 16:34:13.450194695 +0000 UTC m=+183.107001903"
Apr 17 16:34:13.474761 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.474727 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:13.616143 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:13.616112 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79b4b9b57f-5rxbd"]
Apr 17 16:34:13.619743 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:13.619712 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11ad1a17_2e5f_492c_be31_afef2b0e6a94.slice/crio-eb40b093d04ba7d19fad78b5e041e55b00cb028a7217dd7c764a7d41280589d7 WatchSource:0}: Error finding container eb40b093d04ba7d19fad78b5e041e55b00cb028a7217dd7c764a7d41280589d7: Status 404 returned error can't find the container with id eb40b093d04ba7d19fad78b5e041e55b00cb028a7217dd7c764a7d41280589d7
Apr 17 16:34:14.404202 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.404167 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"]
Apr 17 16:34:14.407541 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.407488 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:14.410093 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.409985 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 16:34:14.410093 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.410055 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-c9fpj\""
Apr 17 16:34:14.416915 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.416792 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"]
Apr 17 16:34:14.437505 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.437468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b4b9b57f-5rxbd" event={"ID":"11ad1a17-2e5f-492c-be31-afef2b0e6a94","Type":"ContainerStarted","Data":"eb40b093d04ba7d19fad78b5e041e55b00cb028a7217dd7c764a7d41280589d7"}
Apr 17 16:34:14.471276 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.471247 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4608d06d-372e-4545-b3a1-6275d8b88c82-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gqjm6\" (UID: \"4608d06d-372e-4545-b3a1-6275d8b88c82\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:14.572130 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:14.572088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4608d06d-372e-4545-b3a1-6275d8b88c82-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gqjm6\" (UID: \"4608d06d-372e-4545-b3a1-6275d8b88c82\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:14.572308 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:14.572257 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 16:34:14.572381 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:14.572334 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4608d06d-372e-4545-b3a1-6275d8b88c82-tls-certificates podName:4608d06d-372e-4545-b3a1-6275d8b88c82 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:15.07231413 +0000 UTC m=+184.729121321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/4608d06d-372e-4545-b3a1-6275d8b88c82-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-gqjm6" (UID: "4608d06d-372e-4545-b3a1-6275d8b88c82") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 16:34:15.076280 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:15.076209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4608d06d-372e-4545-b3a1-6275d8b88c82-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gqjm6\" (UID: \"4608d06d-372e-4545-b3a1-6275d8b88c82\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:15.079215 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:15.079189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4608d06d-372e-4545-b3a1-6275d8b88c82-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gqjm6\" (UID: \"4608d06d-372e-4545-b3a1-6275d8b88c82\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:15.321191 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:15.321149 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:15.462301 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:15.462264 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"]
Apr 17 16:34:16.681564 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:16.681519 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4608d06d_372e_4545_b3a1_6275d8b88c82.slice/crio-497121abc60d8d474f590add5431be87cc9749c7da79ab4c516b0c03e6ba76ef WatchSource:0}: Error finding container 497121abc60d8d474f590add5431be87cc9749c7da79ab4c516b0c03e6ba76ef: Status 404 returned error can't find the container with id 497121abc60d8d474f590add5431be87cc9749c7da79ab4c516b0c03e6ba76ef
Apr 17 16:34:17.447227 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:17.447165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b4b9b57f-5rxbd" event={"ID":"11ad1a17-2e5f-492c-be31-afef2b0e6a94","Type":"ContainerStarted","Data":"ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d"}
Apr 17 16:34:17.448397 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:17.448371 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6" event={"ID":"4608d06d-372e-4545-b3a1-6275d8b88c82","Type":"ContainerStarted","Data":"497121abc60d8d474f590add5431be87cc9749c7da79ab4c516b0c03e6ba76ef"}
Apr 17 16:34:17.464617 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:17.464562 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79b4b9b57f-5rxbd" podStartSLOduration=1.354669011 podStartE2EDuration="4.464546759s" podCreationTimestamp="2026-04-17 16:34:13 +0000 UTC" firstStartedPulling="2026-04-17 16:34:13.621738104 +0000 UTC m=+183.278545295" lastFinishedPulling="2026-04-17 16:34:16.731615852 +0000 UTC m=+186.388423043" observedRunningTime="2026-04-17 16:34:17.46327607 +0000 UTC m=+187.120083284" watchObservedRunningTime="2026-04-17 16:34:17.464546759 +0000 UTC m=+187.121353967"
Apr 17 16:34:18.452965 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:18.452915 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6" event={"ID":"4608d06d-372e-4545-b3a1-6275d8b88c82","Type":"ContainerStarted","Data":"2a4f324d2be32882535c013cf6207d8dff9c65e356c384822f440f0c8a8564b4"}
Apr 17 16:34:18.453431 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:18.453370 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:18.459838 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:18.459797 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6"
Apr 17 16:34:18.470940 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:18.470888 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gqjm6" podStartSLOduration=3.398832559 podStartE2EDuration="4.470871931s" podCreationTimestamp="2026-04-17 16:34:14 +0000 UTC" firstStartedPulling="2026-04-17 16:34:16.683569642 +0000 UTC m=+186.340376832" lastFinishedPulling="2026-04-17 16:34:17.755609003 +0000 UTC m=+187.412416204" observedRunningTime="2026-04-17 16:34:18.468738506 +0000 UTC m=+188.125545716" watchObservedRunningTime="2026-04-17 16:34:18.470871931 +0000 UTC m=+188.127679140"
Apr 17 16:34:20.934429 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:20.934398 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-764f6684cd-shspq"]
Apr 17 16:34:20.938975 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:20.938939 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:20.946648 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:20.946624 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 16:34:20.948568 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:20.948524 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764f6684cd-shspq"]
Apr 17 16:34:21.035055 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.035020 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-serving-cert\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.035251 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.035082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-oauth-config\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.035251 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.035144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-oauth-serving-cert\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.035364 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.035328 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-service-ca\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.035456 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.035436 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-config\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.035537 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.035499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5dj\" (UniqueName: \"kubernetes.io/projected/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-kube-api-access-nq5dj\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.035594 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.035544 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-trusted-ca-bundle\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.136446 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.136404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-service-ca\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.136626 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.136471 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-config\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.136626 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.136580 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5dj\" (UniqueName: \"kubernetes.io/projected/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-kube-api-access-nq5dj\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.136756 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.136630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-trusted-ca-bundle\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.136756 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.136692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-serving-cert\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.136872 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.136852 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-oauth-config\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.136937 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.136894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-oauth-serving-cert\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.137324 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.137298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-config\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.137447 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.137303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-service-ca\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.137768 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.137737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-oauth-serving-cert\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.137991 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.137970 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-trusted-ca-bundle\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.139623 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.139585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-oauth-config\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.139844 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.139827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-serving-cert\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.145427 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.145392 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5dj\" (UniqueName: \"kubernetes.io/projected/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-kube-api-access-nq5dj\") pod \"console-764f6684cd-shspq\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:21.251227 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:21.251139 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764f6684cd-shspq"
Apr 17 16:34:23.476119 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.475835 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:23.476119 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.475914 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:23.481239 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.481215 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79b4b9b57f-5rxbd"
Apr 17 16:34:23.937797 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.937758 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r"]
Apr 17 16:34:23.943046 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.943021 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r"
Apr 17 16:34:23.948457 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.948430 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 16:34:23.948573 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.948509 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 16:34:23.949245 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.949120 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 16:34:23.949245 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.949181 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-jv48v\""
Apr 17 16:34:23.949245 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.949194 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 16:34:23.949439 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.949195 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 16:34:23.961557 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.961531 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r"]
Apr 17 16:34:23.962942 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.962921 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cb655"]
Apr 17 16:34:23.966429 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.966408 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qzpx4"]
Apr 17 16:34:23.966608 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.966588 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655"
Apr 17 16:34:23.969726 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.969706 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:23.969919 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.969897 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 16:34:23.970338 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.970297 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-l9jpz\""
Apr 17 16:34:23.970445 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.970385 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 16:34:23.970640 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.970621 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 16:34:23.971997 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.971970 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 16:34:23.972348 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.972316 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 16:34:23.972348 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.972339 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 16:34:23.972485 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.972405 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hr7mt\""
Apr 17 16:34:23.977149 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:23.977127 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cb655"]
Apr 17 16:34:24.063736 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c1b1707-f1e8-419a-bb77-e8d261512299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r"
Apr 17 16:34:24.063949 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/73c01fa2-0bb5-4830-8566-287275e3788e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655"
Apr 17 16:34:24.063949 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063769 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msfjn\" (UniqueName: \"kubernetes.io/projected/73c01fa2-0bb5-4830-8566-287275e3788e-kube-api-access-msfjn\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655"
Apr 17 16:34:24.063949 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.063949 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n272p\" (UniqueName: \"kubernetes.io/projected/e9bdf773-6e19-473e-91a4-6e5e975799cc-kube-api-access-n272p\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.063949 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063882 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c1b1707-f1e8-419a-bb77-e8d261512299-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r"
Apr 17 16:34:24.063949 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063922 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-textfile\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.064211 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-root\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.064211 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.063983 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655"
Apr 17 16:34:24.064211 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-wtmp\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.064211 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73c01fa2-0bb5-4830-8566-287275e3788e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655"
Apr 17 16:34:24.064211 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.064211 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655"
Apr 17 16:34:24.064211 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064169 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-sys\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.064515 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064217 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4"
Apr 17 16:34:24.064515 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064252 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c1b1707-f1e8-419a-bb77-e8d261512299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r"
Apr 17 16:34:24.064515 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064282 2569
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hf58\" (UniqueName: \"kubernetes.io/projected/2c1b1707-f1e8-419a-bb77-e8d261512299-kube-api-access-7hf58\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.064515 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9bdf773-6e19-473e-91a4-6e5e975799cc-metrics-client-ca\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.064515 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.064430 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.165717 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165673 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/2c1b1707-f1e8-419a-bb77-e8d261512299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/73c01fa2-0bb5-4830-8566-287275e3788e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165781 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msfjn\" (UniqueName: \"kubernetes.io/projected/73c01fa2-0bb5-4830-8566-287275e3788e-kube-api-access-msfjn\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:24.165864 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-n272p\" (UniqueName: \"kubernetes.io/projected/e9bdf773-6e19-473e-91a4-6e5e975799cc-kube-api-access-n272p\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c1b1707-f1e8-419a-bb77-e8d261512299-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.165952 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.165947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-textfile\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:24.165973 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-tls podName:73c01fa2-0bb5-4830-8566-287275e3788e nodeName:}" failed. No retries permitted until 2026-04-17 16:34:24.665951975 +0000 UTC m=+194.322759161 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-cb655" (UID: "73c01fa2-0bb5-4830-8566-287275e3788e") : secret "kube-state-metrics-tls" not found Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-root\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-wtmp\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73c01fa2-0bb5-4830-8566-287275e3788e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166159 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-root\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-sys\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166235 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/73c01fa2-0bb5-4830-8566-287275e3788e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166289 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c1b1707-f1e8-419a-bb77-e8d261512299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.166880 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hf58\" (UniqueName: \"kubernetes.io/projected/2c1b1707-f1e8-419a-bb77-e8d261512299-kube-api-access-7hf58\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.166880 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166413 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9bdf773-6e19-473e-91a4-6e5e975799cc-metrics-client-ca\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " 
pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166880 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166733 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.166880 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:24.166828 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:34:24.166880 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:24.166877 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls podName:e9bdf773-6e19-473e-91a4-6e5e975799cc nodeName:}" failed. No retries permitted until 2026-04-17 16:34:24.666860475 +0000 UTC m=+194.323667675 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls") pod "node-exporter-qzpx4" (UID: "e9bdf773-6e19-473e-91a4-6e5e975799cc") : secret "node-exporter-tls" not found Apr 17 16:34:24.167135 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166939 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c1b1707-f1e8-419a-bb77-e8d261512299-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.167135 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166985 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9bdf773-6e19-473e-91a4-6e5e975799cc-metrics-client-ca\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.167135 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.167004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-wtmp\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.167135 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.166213 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-textfile\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.167135 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.167012 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9bdf773-6e19-473e-91a4-6e5e975799cc-sys\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.167381 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.167324 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.167604 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.167555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73c01fa2-0bb5-4830-8566-287275e3788e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.169290 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.169246 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c1b1707-f1e8-419a-bb77-e8d261512299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.169412 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.169386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.169471 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.169396 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.169528 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.169505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c1b1707-f1e8-419a-bb77-e8d261512299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.177624 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.177562 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hf58\" (UniqueName: \"kubernetes.io/projected/2c1b1707-f1e8-419a-bb77-e8d261512299-kube-api-access-7hf58\") pod \"openshift-state-metrics-9d44df66c-5dn2r\" (UID: \"2c1b1707-f1e8-419a-bb77-e8d261512299\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.178170 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.178137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n272p\" (UniqueName: \"kubernetes.io/projected/e9bdf773-6e19-473e-91a4-6e5e975799cc-kube-api-access-n272p\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " 
pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.178313 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.178236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msfjn\" (UniqueName: \"kubernetes.io/projected/73c01fa2-0bb5-4830-8566-287275e3788e-kube-api-access-msfjn\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.254160 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.253945 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" Apr 17 16:34:24.472904 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.472864 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79b4b9b57f-5rxbd" Apr 17 16:34:24.671452 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.671410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:24.671962 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.671521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.671962 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:24.671591 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:34:24.671962 
ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:24.671671 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls podName:e9bdf773-6e19-473e-91a4-6e5e975799cc nodeName:}" failed. No retries permitted until 2026-04-17 16:34:25.67165136 +0000 UTC m=+195.328458562 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls") pod "node-exporter-qzpx4" (UID: "e9bdf773-6e19-473e-91a4-6e5e975799cc") : secret "node-exporter-tls" not found Apr 17 16:34:24.674280 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.674257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/73c01fa2-0bb5-4830-8566-287275e3788e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cb655\" (UID: \"73c01fa2-0bb5-4830-8566-287275e3788e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:24.879597 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:24.879558 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" Apr 17 16:34:25.680239 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:25.680201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:25.682925 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:25.682895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e9bdf773-6e19-473e-91a4-6e5e975799cc-node-exporter-tls\") pod \"node-exporter-qzpx4\" (UID: \"e9bdf773-6e19-473e-91a4-6e5e975799cc\") " pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:25.786510 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:25.786454 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qzpx4" Apr 17 16:34:25.906963 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:25.906789 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r"] Apr 17 16:34:25.909201 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:25.909175 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1b1707_f1e8_419a_bb77_e8d261512299.slice/crio-3c948b79db59eb2a64ee0172bb4dd6730aa640722ee39496a92f725e17e71b3b WatchSource:0}: Error finding container 3c948b79db59eb2a64ee0172bb4dd6730aa640722ee39496a92f725e17e71b3b: Status 404 returned error can't find the container with id 3c948b79db59eb2a64ee0172bb4dd6730aa640722ee39496a92f725e17e71b3b Apr 17 16:34:26.160054 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.159963 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764f6684cd-shspq"] Apr 17 16:34:26.160054 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:26.160030 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7735eef9_d6d0_40ea_85e5_fa65b68a0dc6.slice/crio-91b5d5817dffd1edde8d9083b9cc4fa1d457c195d6bf58f77314fa07ca6d651b WatchSource:0}: Error finding container 91b5d5817dffd1edde8d9083b9cc4fa1d457c195d6bf58f77314fa07ca6d651b: Status 404 returned error can't find the container with id 91b5d5817dffd1edde8d9083b9cc4fa1d457c195d6bf58f77314fa07ca6d651b Apr 17 16:34:26.175738 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.175694 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cb655"] Apr 17 16:34:26.180018 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:26.179978 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73c01fa2_0bb5_4830_8566_287275e3788e.slice/crio-891f4f5aeda08bdc2f3bbfee2f7de4a35827eae50b53bc19a85e1ae2ebedf2b4 WatchSource:0}: Error finding container 891f4f5aeda08bdc2f3bbfee2f7de4a35827eae50b53bc19a85e1ae2ebedf2b4: Status 404 returned error can't find the container with id 891f4f5aeda08bdc2f3bbfee2f7de4a35827eae50b53bc19a85e1ae2ebedf2b4
Apr 17 16:34:26.483321 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.482056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wn6rg" event={"ID":"f8e80855-0ca0-4413-990c-7bfef055921c","Type":"ContainerStarted","Data":"fb9a36bccaf575b85e6f9c49f76b3e698e09124f3d90b7ddc8ebb886b207ba7e"}
Apr 17 16:34:26.483321 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.483131 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-wn6rg"
Apr 17 16:34:26.486123 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.486070 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" event={"ID":"73c01fa2-0bb5-4830-8566-287275e3788e","Type":"ContainerStarted","Data":"891f4f5aeda08bdc2f3bbfee2f7de4a35827eae50b53bc19a85e1ae2ebedf2b4"}
Apr 17 16:34:26.488087 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.488055 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzpx4" event={"ID":"e9bdf773-6e19-473e-91a4-6e5e975799cc","Type":"ContainerStarted","Data":"7c7151777160c52b1e3c116d1a55498759a6e78f62faba06675039bd80a753e9"}
Apr 17 16:34:26.490030 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.489887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764f6684cd-shspq" event={"ID":"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6","Type":"ContainerStarted","Data":"ddeb9a2a7717a0e1efb01bdcbf592afdbb0b51dd433f5ce0d53da02a587e8734"}
Apr 17 16:34:26.490030 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.489922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764f6684cd-shspq" event={"ID":"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6","Type":"ContainerStarted","Data":"91b5d5817dffd1edde8d9083b9cc4fa1d457c195d6bf58f77314fa07ca6d651b"}
Apr 17 16:34:26.492419 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.492352 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" event={"ID":"2c1b1707-f1e8-419a-bb77-e8d261512299","Type":"ContainerStarted","Data":"a46d9a4a1e8aa869418f7f75a9acda9bc1c67852a9a9773eaaa50603cb08a7a9"}
Apr 17 16:34:26.492419 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.492383 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" event={"ID":"2c1b1707-f1e8-419a-bb77-e8d261512299","Type":"ContainerStarted","Data":"c25828b7b2da530f5a1fba6c8fdb9ef90b22dadd448bf8c23ceda02251226e8c"}
Apr 17 16:34:26.492419 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.492397 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" event={"ID":"2c1b1707-f1e8-419a-bb77-e8d261512299","Type":"ContainerStarted","Data":"3c948b79db59eb2a64ee0172bb4dd6730aa640722ee39496a92f725e17e71b3b"}
Apr 17 16:34:26.498471 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.498441 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-wn6rg"
Apr 17 16:34:26.514148 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.514088 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-wn6rg" podStartSLOduration=1.337174241 podStartE2EDuration="17.51407186s" podCreationTimestamp="2026-04-17 16:34:09 +0000 UTC" firstStartedPulling="2026-04-17 16:34:09.684977408 +0000 UTC m=+179.341784601" lastFinishedPulling="2026-04-17 16:34:25.861875025 +0000 UTC m=+195.518682220" observedRunningTime="2026-04-17 16:34:26.512958717 +0000 UTC m=+196.169765928" watchObservedRunningTime="2026-04-17 16:34:26.51407186 +0000 UTC m=+196.170879069"
Apr 17 16:34:26.604257 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:26.602786 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-764f6684cd-shspq" podStartSLOduration=6.602765411 podStartE2EDuration="6.602765411s" podCreationTimestamp="2026-04-17 16:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:34:26.600517829 +0000 UTC m=+196.257325039" watchObservedRunningTime="2026-04-17 16:34:26.602765411 +0000 UTC m=+196.259572634"
Apr 17 16:34:27.497437 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:27.497153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzpx4" event={"ID":"e9bdf773-6e19-473e-91a4-6e5e975799cc","Type":"ContainerStarted","Data":"387222c5fa927d2321005bb531f49f90be090854084f3cc66cdc9d8d5e5b8b8c"}
Apr 17 16:34:27.500456 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:27.500413 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" event={"ID":"2c1b1707-f1e8-419a-bb77-e8d261512299","Type":"ContainerStarted","Data":"31d3f1533384b1eb1d4a5fd2790ce3e81c372b500dceec19732a30c1d9644888"}
Apr 17 16:34:27.556425 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:27.556368 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5dn2r" podStartSLOduration=3.523492819 podStartE2EDuration="4.556347639s" podCreationTimestamp="2026-04-17 16:34:23 +0000 UTC" firstStartedPulling="2026-04-17 16:34:26.041655123 +0000 UTC m=+195.698462312" lastFinishedPulling="2026-04-17 16:34:27.074509943 +0000 UTC m=+196.731317132" observedRunningTime="2026-04-17 16:34:27.554831542 +0000 UTC m=+197.211638750" watchObservedRunningTime="2026-04-17 16:34:27.556347639 +0000 UTC m=+197.213154847"
Apr 17 16:34:28.506543 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.506496 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" event={"ID":"73c01fa2-0bb5-4830-8566-287275e3788e","Type":"ContainerStarted","Data":"d315a74ae9fd02b820a6aff8ba2c0dbc5455a29998a608f3c01ccce202c45aa0"}
Apr 17 16:34:28.506543 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.506545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" event={"ID":"73c01fa2-0bb5-4830-8566-287275e3788e","Type":"ContainerStarted","Data":"0b3f228b3837c1e33de1be48fa74ff5c54cf4150b2df147ec945a53d1e6119e5"}
Apr 17 16:34:28.507062 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.506559 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" event={"ID":"73c01fa2-0bb5-4830-8566-287275e3788e","Type":"ContainerStarted","Data":"0558d733c54475301661554c6ac3af2ac0ca9849c3d0879603a25dd3b216558f"}
Apr 17 16:34:28.508007 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.507977 2569 generic.go:358] "Generic (PLEG): container finished" podID="e9bdf773-6e19-473e-91a4-6e5e975799cc" containerID="387222c5fa927d2321005bb531f49f90be090854084f3cc66cdc9d8d5e5b8b8c" exitCode=0
Apr 17 16:34:28.508136 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.508073 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzpx4" event={"ID":"e9bdf773-6e19-473e-91a4-6e5e975799cc","Type":"ContainerDied","Data":"387222c5fa927d2321005bb531f49f90be090854084f3cc66cdc9d8d5e5b8b8c"}
Apr 17 16:34:28.545657 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.545602 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-cb655" podStartSLOduration=4.123323216 podStartE2EDuration="5.545582871s" podCreationTimestamp="2026-04-17 16:34:23 +0000 UTC" firstStartedPulling="2026-04-17 16:34:26.182377209 +0000 UTC m=+195.839184399" lastFinishedPulling="2026-04-17 16:34:27.60463685 +0000 UTC m=+197.261444054" observedRunningTime="2026-04-17 16:34:28.543882273 +0000 UTC m=+198.200689508" watchObservedRunningTime="2026-04-17 16:34:28.545582871 +0000 UTC m=+198.202390080"
Apr 17 16:34:28.722943 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.722914 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"]
Apr 17 16:34:28.758260 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.758193 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"]
Apr 17 16:34:28.758404 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.758339 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"
Apr 17 16:34:28.762492 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.762465 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 16:34:28.762623 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.762606 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lcgdv\""
Apr 17 16:34:28.912370 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:28.912303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8stz2\" (UID: \"eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"
Apr 17 16:34:29.013138 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.013095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8stz2\" (UID: \"eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"
Apr 17 16:34:29.013324 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:29.013256 2569 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 17 16:34:29.013386 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:29.013325 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb-monitoring-plugin-cert podName:eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb nodeName:}" failed. No retries permitted until 2026-04-17 16:34:29.513306577 +0000 UTC m=+199.170113787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-8stz2" (UID: "eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb") : secret "monitoring-plugin-cert" not found
Apr 17 16:34:29.515184 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.515145 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzpx4" event={"ID":"e9bdf773-6e19-473e-91a4-6e5e975799cc","Type":"ContainerStarted","Data":"f5246ad326c0e5ed8d528e2c32a277ed69e9e638462e7980281c3fd90a9598f8"}
Apr 17 16:34:29.515611 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.515195 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzpx4" event={"ID":"e9bdf773-6e19-473e-91a4-6e5e975799cc","Type":"ContainerStarted","Data":"eb104b6b94697a1a3f223ddb956beaa1a4302edaf727bd3b54ebf5d70eaec832"}
Apr 17 16:34:29.519093 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.519062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8stz2\" (UID: \"eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"
Apr 17 16:34:29.521717 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.521694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8stz2\" (UID: \"eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"
Apr 17 16:34:29.538688 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.538633 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qzpx4" podStartSLOduration=5.680823394 podStartE2EDuration="6.538612599s" podCreationTimestamp="2026-04-17 16:34:23 +0000 UTC" firstStartedPulling="2026-04-17 16:34:25.811268521 +0000 UTC m=+195.468075708" lastFinishedPulling="2026-04-17 16:34:26.669057716 +0000 UTC m=+196.325864913" observedRunningTime="2026-04-17 16:34:29.536644206 +0000 UTC m=+199.193451453" watchObservedRunningTime="2026-04-17 16:34:29.538612599 +0000 UTC m=+199.195419808"
Apr 17 16:34:29.669527 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.669486 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"
Apr 17 16:34:29.808192 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:29.808158 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2"]
Apr 17 16:34:29.810882 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:29.810850 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb81ba0c_4f9c_47f0_be8d_5c88e55e8ceb.slice/crio-96b223bf602f862cadaca445af297f4b0da2eb86e6ec69179da101b484620111 WatchSource:0}: Error finding container 96b223bf602f862cadaca445af297f4b0da2eb86e6ec69179da101b484620111: Status 404 returned error can't find the container with id 96b223bf602f862cadaca445af297f4b0da2eb86e6ec69179da101b484620111
Apr 17 16:34:30.374145 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.374102 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764f6684cd-shspq"]
Apr 17 16:34:30.408502 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.408467 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d5c757cd7-5cbvx"]
Apr 17 16:34:30.428974 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.428941 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d5c757cd7-5cbvx"]
Apr 17 16:34:30.429144 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.429086 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.459935 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.459881 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:34:30.482917 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.482032 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.485047 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.485020 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 16:34:30.485333 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.485316 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 16:34:30.485648 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.485627 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 16:34:30.486036 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.486013 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 16:34:30.486908 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.486888 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 16:34:30.487255 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.487148 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-775bm0no2uttm\""
Apr 17 16:34:30.487459 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.487374 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 16:34:30.487587 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.487567 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 16:34:30.487953 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.487732 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 16:34:30.487953 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.487766 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 16:34:30.488100 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.487993 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 16:34:30.488100 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.488009 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 16:34:30.488848 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.488340 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qx5w8\""
Apr 17 16:34:30.488848 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.488384 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 16:34:30.490365 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.490044 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 16:34:30.492988 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.491678 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:34:30.522562 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.522518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2" event={"ID":"eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb","Type":"ContainerStarted","Data":"96b223bf602f862cadaca445af297f4b0da2eb86e6ec69179da101b484620111"}
Apr 17 16:34:30.528689 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.528662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-oauth-config\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.528870 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.528727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69th\" (UniqueName: \"kubernetes.io/projected/087770ad-46ca-429b-9fb8-a24f6cdc1d69-kube-api-access-j69th\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.528870 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.528757 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-trusted-ca-bundle\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.528870 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.528782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-config\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.529034 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.528900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-service-ca\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.529034 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.528949 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-serving-cert\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.529034 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.528975 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-oauth-serving-cert\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630307 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630452 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-config\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630540 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-service-ca\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630615 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630644 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630667 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.630692 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630713 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-serving-cert\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-oauth-serving-cert\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630927 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-config-out\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-web-config\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.630999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-oauth-config\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j69th\" (UniqueName: \"kubernetes.io/projected/087770ad-46ca-429b-9fb8-a24f6cdc1d69-kube-api-access-j69th\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-trusted-ca-bundle\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-config\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631226 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631285 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhgr\" (UniqueName: \"kubernetes.io/projected/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-kube-api-access-kdhgr\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-service-ca\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.631509 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631371 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.632306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631422 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.632306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.631541 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-oauth-serving-cert\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.632306 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.632190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-trusted-ca-bundle\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.632452 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.632377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-config\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.634726 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.634686 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-oauth-config\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.635234 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.635192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-serving-cert\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.640957 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.640915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69th\" (UniqueName: \"kubernetes.io/projected/087770ad-46ca-429b-9fb8-a24f6cdc1d69-kube-api-access-j69th\") pod \"console-6d5c757cd7-5cbvx\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " pod="openshift-console/console-6d5c757cd7-5cbvx"
Apr 17 16:34:30.732300 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732238 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732300 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732333 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732469 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-config-out\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732562 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-web-config\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.732625 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732600 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:34:30.733149 ip-10-0-136-182
kubenswrapper[2569]: I0417 16:34:30.732665 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.733149 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.733149 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhgr\" (UniqueName: \"kubernetes.io/projected/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-kube-api-access-kdhgr\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.733149 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.733149 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732836 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.733149 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.733149 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.732940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-config\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.733566 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.733199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.734065 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.734035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.735743 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.734650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.736555 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.736524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.736658 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.736593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.737136 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.737092 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.737221 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.737181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.738147 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.738124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-tls\") 
pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.739681 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.739636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.741324 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.741283 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.742990 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.742533 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d5c757cd7-5cbvx" Apr 17 16:34:30.744126 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.744073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-web-config\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.745318 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.745291 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-config-out\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.745470 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.745450 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.745470 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.745463 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.745654 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.745633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.745891 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.745788 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-config\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.746874 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.746841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.749438 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.749292 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhgr\" (UniqueName: \"kubernetes.io/projected/28c4d7cd-c381-4e90-8fc2-82cbb0c46b07-kube-api-access-kdhgr\") pod \"prometheus-k8s-0\" (UID: \"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.795041 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.795006 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:30.919455 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.919424 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d5c757cd7-5cbvx"] Apr 17 16:34:30.934439 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:30.934403 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod087770ad_46ca_429b_9fb8_a24f6cdc1d69.slice/crio-00c8d1119273036377e276f23f165fcc6e715b1b40a20e6b7d62c14068c19cca WatchSource:0}: Error finding container 00c8d1119273036377e276f23f165fcc6e715b1b40a20e6b7d62c14068c19cca: Status 404 returned error can't find the container with id 00c8d1119273036377e276f23f165fcc6e715b1b40a20e6b7d62c14068c19cca Apr 17 16:34:30.993175 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:30.993142 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:34:30.998090 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:34:30.998057 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c4d7cd_c381_4e90_8fc2_82cbb0c46b07.slice/crio-546f61ce83afadbe34f76a9d36cb9566df8648a8f9cb1ddfdc7230b8242daac2 WatchSource:0}: Error finding container 546f61ce83afadbe34f76a9d36cb9566df8648a8f9cb1ddfdc7230b8242daac2: Status 404 returned error can't find the container with id 546f61ce83afadbe34f76a9d36cb9566df8648a8f9cb1ddfdc7230b8242daac2 Apr 17 16:34:31.251738 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:31.251653 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-764f6684cd-shspq" Apr 17 16:34:31.527821 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:31.527767 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5c757cd7-5cbvx" 
event={"ID":"087770ad-46ca-429b-9fb8-a24f6cdc1d69","Type":"ContainerStarted","Data":"2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee"} Apr 17 16:34:31.528283 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:31.527829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5c757cd7-5cbvx" event={"ID":"087770ad-46ca-429b-9fb8-a24f6cdc1d69","Type":"ContainerStarted","Data":"00c8d1119273036377e276f23f165fcc6e715b1b40a20e6b7d62c14068c19cca"} Apr 17 16:34:31.528988 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:31.528958 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerStarted","Data":"546f61ce83afadbe34f76a9d36cb9566df8648a8f9cb1ddfdc7230b8242daac2"} Apr 17 16:34:31.546870 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:31.546824 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d5c757cd7-5cbvx" podStartSLOduration=1.546788879 podStartE2EDuration="1.546788879s" podCreationTimestamp="2026-04-17 16:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:34:31.545624868 +0000 UTC m=+201.202432102" watchObservedRunningTime="2026-04-17 16:34:31.546788879 +0000 UTC m=+201.203596087" Apr 17 16:34:31.681276 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:31.681243 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-786846df7d-8q4xm"] Apr 17 16:34:31.681566 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:31.681539 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-786846df7d-8q4xm" podUID="eab9531b-0bb9-4c09-812f-28f31cb4253e" Apr 17 16:34:32.536285 
ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.536231 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2" event={"ID":"eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb","Type":"ContainerStarted","Data":"d159203d5198280bd4c1ad4f547920495cee1f283b69df7b09e4395f5535f6c4"} Apr 17 16:34:32.536285 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.536250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:34:32.536910 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.536602 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2" Apr 17 16:34:32.541967 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.541940 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:34:32.542263 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.542244 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2" Apr 17 16:34:32.552917 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.552884 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-installation-pull-secrets\") pod \"eab9531b-0bb9-4c09-812f-28f31cb4253e\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " Apr 17 16:34:32.553070 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.552997 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eab9531b-0bb9-4c09-812f-28f31cb4253e-ca-trust-extracted\") pod \"eab9531b-0bb9-4c09-812f-28f31cb4253e\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " Apr 17 
16:34:32.553070 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553035 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-bound-sa-token\") pod \"eab9531b-0bb9-4c09-812f-28f31cb4253e\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " Apr 17 16:34:32.553187 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553071 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-certificates\") pod \"eab9531b-0bb9-4c09-812f-28f31cb4253e\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " Apr 17 16:34:32.553187 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553096 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgtfg\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-kube-api-access-cgtfg\") pod \"eab9531b-0bb9-4c09-812f-28f31cb4253e\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " Apr 17 16:34:32.553187 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553128 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-trusted-ca\") pod \"eab9531b-0bb9-4c09-812f-28f31cb4253e\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " Apr 17 16:34:32.553187 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553157 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-image-registry-private-configuration\") pod \"eab9531b-0bb9-4c09-812f-28f31cb4253e\" (UID: \"eab9531b-0bb9-4c09-812f-28f31cb4253e\") " Apr 17 16:34:32.553388 ip-10-0-136-182 kubenswrapper[2569]: I0417 
16:34:32.553306 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab9531b-0bb9-4c09-812f-28f31cb4253e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "eab9531b-0bb9-4c09-812f-28f31cb4253e" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:34:32.553388 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553366 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "eab9531b-0bb9-4c09-812f-28f31cb4253e" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:32.554652 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553672 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eab9531b-0bb9-4c09-812f-28f31cb4253e-ca-trust-extracted\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.554652 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.553693 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-certificates\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.554652 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.554333 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "eab9531b-0bb9-4c09-812f-28f31cb4253e" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:32.554972 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.554915 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8stz2" podStartSLOduration=2.514597347 podStartE2EDuration="4.55489846s" podCreationTimestamp="2026-04-17 16:34:28 +0000 UTC" firstStartedPulling="2026-04-17 16:34:29.813044006 +0000 UTC m=+199.469851198" lastFinishedPulling="2026-04-17 16:34:31.853345114 +0000 UTC m=+201.510152311" observedRunningTime="2026-04-17 16:34:32.554315695 +0000 UTC m=+202.211122905" watchObservedRunningTime="2026-04-17 16:34:32.55489846 +0000 UTC m=+202.211705672" Apr 17 16:34:32.556208 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.556163 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "eab9531b-0bb9-4c09-812f-28f31cb4253e" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:32.556481 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.556460 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "eab9531b-0bb9-4c09-812f-28f31cb4253e" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:32.556642 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.556611 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "eab9531b-0bb9-4c09-812f-28f31cb4253e" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:32.557266 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.557238 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-kube-api-access-cgtfg" (OuterVolumeSpecName: "kube-api-access-cgtfg") pod "eab9531b-0bb9-4c09-812f-28f31cb4253e" (UID: "eab9531b-0bb9-4c09-812f-28f31cb4253e"). InnerVolumeSpecName "kube-api-access-cgtfg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:32.654489 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.654451 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-bound-sa-token\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.654489 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.654488 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cgtfg\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-kube-api-access-cgtfg\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.654489 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.654499 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eab9531b-0bb9-4c09-812f-28f31cb4253e-trusted-ca\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.654748 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.654510 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-image-registry-private-configuration\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:32.654748 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:32.654527 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eab9531b-0bb9-4c09-812f-28f31cb4253e-installation-pull-secrets\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:33.541270 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:33.541219 2569 generic.go:358] "Generic (PLEG): container finished" podID="28c4d7cd-c381-4e90-8fc2-82cbb0c46b07" containerID="990b59372c0a7da7f5ef99ba10d5ef0a72bed8e125b52f0dbce11ab7738ffc54" exitCode=0 Apr 17 16:34:33.541867 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:34:33.541331 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-786846df7d-8q4xm" Apr 17 16:34:33.541867 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:33.541322 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerDied","Data":"990b59372c0a7da7f5ef99ba10d5ef0a72bed8e125b52f0dbce11ab7738ffc54"} Apr 17 16:34:33.598629 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:33.598597 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-786846df7d-8q4xm"] Apr 17 16:34:33.602269 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:33.602241 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-786846df7d-8q4xm"] Apr 17 16:34:33.664486 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:33.664454 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eab9531b-0bb9-4c09-812f-28f31cb4253e-registry-tls\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:33.772879 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:33.772845 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d5c757cd7-5cbvx"] Apr 17 16:34:34.920335 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:34.920299 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab9531b-0bb9-4c09-812f-28f31cb4253e" path="/var/lib/kubelet/pods/eab9531b-0bb9-4c09-812f-28f31cb4253e/volumes" Apr 17 16:34:37.556714 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:37.556680 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerStarted","Data":"40fe2a72cfa211a8a7a0dbf5ccbe000f740ae72b771a6b993dadad7bef19e062"} Apr 17 16:34:37.556714 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:37.556716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerStarted","Data":"023ee5e9a0fa4c3a9d5b0288145d0f9550e1b98e4ddd0ec7e5cd7b800f6ae87b"} Apr 17 16:34:40.568096 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:40.568056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerStarted","Data":"0ac44287524245f5c05db81a9e215e0e5e1d041f0a3c26af5ca0478358181129"} Apr 17 16:34:40.568096 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:40.568100 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerStarted","Data":"1b6416c06472efbf913a2b5432008b77fbe7f2a814def5651da6fabbcb39ff49"} Apr 17 16:34:40.568539 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:40.568113 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerStarted","Data":"0323598c6dbb49a9d855bc521fd6d107286d3efb1a98b2a4a691cc5683a25f85"} Apr 17 16:34:40.568539 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:40.568126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28c4d7cd-c381-4e90-8fc2-82cbb0c46b07","Type":"ContainerStarted","Data":"a59f7644681f1c59e7219488e35154f0c036537150cf18c5e88afd7d7edff67c"} Apr 17 16:34:40.596900 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:40.596785 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.520700949 podStartE2EDuration="10.596763679s" podCreationTimestamp="2026-04-17 16:34:30 +0000 UTC" firstStartedPulling="2026-04-17 16:34:31.000369977 +0000 UTC m=+200.657177184" lastFinishedPulling="2026-04-17 16:34:40.076432725 +0000 UTC m=+209.733239914" observedRunningTime="2026-04-17 16:34:40.59492829 +0000 UTC m=+210.251735489" watchObservedRunningTime="2026-04-17 16:34:40.596763679 +0000 UTC m=+210.253570888" Apr 17 16:34:40.743496 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:40.743462 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d5c757cd7-5cbvx" Apr 17 16:34:40.795951 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:40.795911 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:34:45.153295 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:45.153261 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79b4b9b57f-5rxbd"] Apr 17 16:34:48.594604 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:48.594569 2569 generic.go:358] "Generic (PLEG): container finished" podID="f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068" containerID="cc46971ddca344b7a29b3c6385cf177b09d51742d9745dbe033508de85b8b172" exitCode=0 Apr 17 16:34:48.595003 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:48.594641 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch" event={"ID":"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068","Type":"ContainerDied","Data":"cc46971ddca344b7a29b3c6385cf177b09d51742d9745dbe033508de85b8b172"} Apr 17 16:34:48.595044 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:48.595003 2569 scope.go:117] "RemoveContainer" containerID="cc46971ddca344b7a29b3c6385cf177b09d51742d9745dbe033508de85b8b172" Apr 17 16:34:49.600997 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:34:49.600966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zlzch" event={"ID":"f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068","Type":"ContainerStarted","Data":"9cc9409dbef4489aecfed4e4e7486aaa3653338046690b7cda3e1879d5911d42"} Apr 17 16:34:55.395676 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.395607 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-764f6684cd-shspq" podUID="7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" containerName="console" containerID="cri-o://ddeb9a2a7717a0e1efb01bdcbf592afdbb0b51dd433f5ce0d53da02a587e8734" gracePeriod=15 Apr 17 16:34:55.618844 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.618820 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764f6684cd-shspq_7735eef9-d6d0-40ea-85e5-fa65b68a0dc6/console/0.log" Apr 17 16:34:55.618964 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.618867 2569 generic.go:358] "Generic (PLEG): container finished" podID="7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" containerID="ddeb9a2a7717a0e1efb01bdcbf592afdbb0b51dd433f5ce0d53da02a587e8734" exitCode=2 Apr 17 16:34:55.618964 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.618926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764f6684cd-shspq" event={"ID":"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6","Type":"ContainerDied","Data":"ddeb9a2a7717a0e1efb01bdcbf592afdbb0b51dd433f5ce0d53da02a587e8734"} Apr 17 16:34:55.618964 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.618950 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764f6684cd-shspq" event={"ID":"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6","Type":"ContainerDied","Data":"91b5d5817dffd1edde8d9083b9cc4fa1d457c195d6bf58f77314fa07ca6d651b"} Apr 17 16:34:55.618964 ip-10-0-136-182 kubenswrapper[2569]: I0417 
16:34:55.618964 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91b5d5817dffd1edde8d9083b9cc4fa1d457c195d6bf58f77314fa07ca6d651b" Apr 17 16:34:55.631442 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.631421 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764f6684cd-shspq_7735eef9-d6d0-40ea-85e5-fa65b68a0dc6/console/0.log" Apr 17 16:34:55.631560 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.631485 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764f6684cd-shspq" Apr 17 16:34:55.768720 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.768688 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-service-ca\") pod \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " Apr 17 16:34:55.768922 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.768750 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5dj\" (UniqueName: \"kubernetes.io/projected/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-kube-api-access-nq5dj\") pod \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " Apr 17 16:34:55.768922 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.768770 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-config\") pod \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " Apr 17 16:34:55.768922 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.768789 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-serving-cert\") pod \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " Apr 17 16:34:55.768922 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.768862 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-oauth-serving-cert\") pod \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " Apr 17 16:34:55.769138 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.768925 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-oauth-config\") pod \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " Apr 17 16:34:55.769138 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.768966 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-trusted-ca-bundle\") pod \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\" (UID: \"7735eef9-d6d0-40ea-85e5-fa65b68a0dc6\") " Apr 17 16:34:55.769239 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.769148 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-service-ca" (OuterVolumeSpecName: "service-ca") pod "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" (UID: "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:55.769239 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.769208 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-config" (OuterVolumeSpecName: "console-config") pod "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" (UID: "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:55.769349 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.769247 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" (UID: "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:55.769349 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.769266 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-config\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:55.769349 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.769285 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-service-ca\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:55.769521 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.769504 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" (UID: "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:55.771208 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.771187 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" (UID: "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:55.771305 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.771208 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" (UID: "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:55.771305 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.771212 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-kube-api-access-nq5dj" (OuterVolumeSpecName: "kube-api-access-nq5dj") pod "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" (UID: "7735eef9-d6d0-40ea-85e5-fa65b68a0dc6"). InnerVolumeSpecName "kube-api-access-nq5dj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:55.870309 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.870274 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-oauth-config\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:55.870309 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.870303 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-trusted-ca-bundle\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:55.870309 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.870313 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nq5dj\" (UniqueName: \"kubernetes.io/projected/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-kube-api-access-nq5dj\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:55.870528 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.870322 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-console-serving-cert\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:55.870528 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:55.870332 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6-oauth-serving-cert\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:56.621496 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:56.621463 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-764f6684cd-shspq" Apr 17 16:34:56.641930 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:56.641903 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764f6684cd-shspq"] Apr 17 16:34:56.645606 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:56.645583 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-764f6684cd-shspq"] Apr 17 16:34:56.914150 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:56.914075 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" path="/var/lib/kubelet/pods/7735eef9-d6d0-40ea-85e5-fa65b68a0dc6/volumes" Apr 17 16:34:58.794125 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:58.794084 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d5c757cd7-5cbvx" podUID="087770ad-46ca-429b-9fb8-a24f6cdc1d69" containerName="console" containerID="cri-o://2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee" gracePeriod=15 Apr 17 16:34:59.046585 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.046527 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d5c757cd7-5cbvx_087770ad-46ca-429b-9fb8-a24f6cdc1d69/console/0.log" Apr 17 16:34:59.046701 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.046588 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d5c757cd7-5cbvx" Apr 17 16:34:59.200955 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.200925 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-trusted-ca-bundle\") pod \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " Apr 17 16:34:59.201124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.200965 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-service-ca\") pod \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " Apr 17 16:34:59.201124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201002 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-config\") pod \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " Apr 17 16:34:59.201124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201025 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-serving-cert\") pod \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " Apr 17 16:34:59.201124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201055 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j69th\" (UniqueName: \"kubernetes.io/projected/087770ad-46ca-429b-9fb8-a24f6cdc1d69-kube-api-access-j69th\") pod \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " Apr 17 16:34:59.201124 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:34:59.201113 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-oauth-serving-cert\") pod \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " Apr 17 16:34:59.201395 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201149 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-oauth-config\") pod \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\" (UID: \"087770ad-46ca-429b-9fb8-a24f6cdc1d69\") " Apr 17 16:34:59.201449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201401 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-config" (OuterVolumeSpecName: "console-config") pod "087770ad-46ca-429b-9fb8-a24f6cdc1d69" (UID: "087770ad-46ca-429b-9fb8-a24f6cdc1d69"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:59.201449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201418 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "087770ad-46ca-429b-9fb8-a24f6cdc1d69" (UID: "087770ad-46ca-429b-9fb8-a24f6cdc1d69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:59.201449 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201407 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-service-ca" (OuterVolumeSpecName: "service-ca") pod "087770ad-46ca-429b-9fb8-a24f6cdc1d69" (UID: "087770ad-46ca-429b-9fb8-a24f6cdc1d69"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:59.201715 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.201697 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "087770ad-46ca-429b-9fb8-a24f6cdc1d69" (UID: "087770ad-46ca-429b-9fb8-a24f6cdc1d69"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:59.203359 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.203337 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "087770ad-46ca-429b-9fb8-a24f6cdc1d69" (UID: "087770ad-46ca-429b-9fb8-a24f6cdc1d69"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:59.203468 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.203449 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "087770ad-46ca-429b-9fb8-a24f6cdc1d69" (UID: "087770ad-46ca-429b-9fb8-a24f6cdc1d69"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:59.203520 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.203458 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087770ad-46ca-429b-9fb8-a24f6cdc1d69-kube-api-access-j69th" (OuterVolumeSpecName: "kube-api-access-j69th") pod "087770ad-46ca-429b-9fb8-a24f6cdc1d69" (UID: "087770ad-46ca-429b-9fb8-a24f6cdc1d69"). InnerVolumeSpecName "kube-api-access-j69th". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:59.302734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.302651 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j69th\" (UniqueName: \"kubernetes.io/projected/087770ad-46ca-429b-9fb8-a24f6cdc1d69-kube-api-access-j69th\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:59.302734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.302682 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-oauth-serving-cert\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:59.302734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.302691 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-oauth-config\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:59.302734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.302699 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-trusted-ca-bundle\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:59.302734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.302709 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-service-ca\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:59.302734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.302718 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-config\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:59.302734 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.302726 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/087770ad-46ca-429b-9fb8-a24f6cdc1d69-console-serving-cert\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:34:59.630950 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.630876 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d5c757cd7-5cbvx_087770ad-46ca-429b-9fb8-a24f6cdc1d69/console/0.log" Apr 17 16:34:59.630950 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.630916 2569 generic.go:358] "Generic (PLEG): container finished" podID="087770ad-46ca-429b-9fb8-a24f6cdc1d69" containerID="2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee" exitCode=2 Apr 17 16:34:59.631124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.630954 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5c757cd7-5cbvx" event={"ID":"087770ad-46ca-429b-9fb8-a24f6cdc1d69","Type":"ContainerDied","Data":"2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee"} Apr 17 16:34:59.631124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.630994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d5c757cd7-5cbvx" event={"ID":"087770ad-46ca-429b-9fb8-a24f6cdc1d69","Type":"ContainerDied","Data":"00c8d1119273036377e276f23f165fcc6e715b1b40a20e6b7d62c14068c19cca"} Apr 17 16:34:59.631124 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:34:59.631009 2569 scope.go:117] "RemoveContainer" containerID="2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee" Apr 17 16:34:59.631124 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.631019 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d5c757cd7-5cbvx" Apr 17 16:34:59.639263 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.639243 2569 scope.go:117] "RemoveContainer" containerID="2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee" Apr 17 16:34:59.639526 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:34:59.639507 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee\": container with ID starting with 2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee not found: ID does not exist" containerID="2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee" Apr 17 16:34:59.639576 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.639534 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee"} err="failed to get container status \"2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee\": rpc error: code = NotFound desc = could not find container \"2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee\": container with ID starting with 2805f1500d356bee21d798ebf8e035b751f68077434fd8e52fe13504842c9aee not found: ID does not exist" Apr 17 16:34:59.651783 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.651754 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d5c757cd7-5cbvx"] Apr 17 16:34:59.654998 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:34:59.654976 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-6d5c757cd7-5cbvx"] Apr 17 16:35:00.913815 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:00.913761 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087770ad-46ca-429b-9fb8-a24f6cdc1d69" path="/var/lib/kubelet/pods/087770ad-46ca-429b-9fb8-a24f6cdc1d69/volumes" Apr 17 16:35:10.172442 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.172378 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79b4b9b57f-5rxbd" podUID="11ad1a17-2e5f-492c-be31-afef2b0e6a94" containerName="console" containerID="cri-o://ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d" gracePeriod=15 Apr 17 16:35:10.441268 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.441244 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79b4b9b57f-5rxbd_11ad1a17-2e5f-492c-be31-afef2b0e6a94/console/0.log" Apr 17 16:35:10.441384 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.441307 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79b4b9b57f-5rxbd" Apr 17 16:35:10.601788 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.601756 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-oauth-serving-cert\") pod \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " Apr 17 16:35:10.601988 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.601794 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-config\") pod \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " Apr 17 16:35:10.601988 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.601847 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-oauth-config\") pod \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " Apr 17 16:35:10.601988 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.601922 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-serving-cert\") pod \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " Apr 17 16:35:10.601988 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.601948 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-service-ca\") pod \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " Apr 17 16:35:10.601988 ip-10-0-136-182 
kubenswrapper[2569]: I0417 16:35:10.601988 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv7pn\" (UniqueName: \"kubernetes.io/projected/11ad1a17-2e5f-492c-be31-afef2b0e6a94-kube-api-access-kv7pn\") pod \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\" (UID: \"11ad1a17-2e5f-492c-be31-afef2b0e6a94\") " Apr 17 16:35:10.602338 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.602302 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "11ad1a17-2e5f-492c-be31-afef2b0e6a94" (UID: "11ad1a17-2e5f-492c-be31-afef2b0e6a94"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:10.602338 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.602334 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-config" (OuterVolumeSpecName: "console-config") pod "11ad1a17-2e5f-492c-be31-afef2b0e6a94" (UID: "11ad1a17-2e5f-492c-be31-afef2b0e6a94"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:10.602520 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.602432 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-service-ca" (OuterVolumeSpecName: "service-ca") pod "11ad1a17-2e5f-492c-be31-afef2b0e6a94" (UID: "11ad1a17-2e5f-492c-be31-afef2b0e6a94"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:10.604208 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.604182 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "11ad1a17-2e5f-492c-be31-afef2b0e6a94" (UID: "11ad1a17-2e5f-492c-be31-afef2b0e6a94"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:10.604300 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.604187 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ad1a17-2e5f-492c-be31-afef2b0e6a94-kube-api-access-kv7pn" (OuterVolumeSpecName: "kube-api-access-kv7pn") pod "11ad1a17-2e5f-492c-be31-afef2b0e6a94" (UID: "11ad1a17-2e5f-492c-be31-afef2b0e6a94"). InnerVolumeSpecName "kube-api-access-kv7pn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:35:10.604354 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.604292 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "11ad1a17-2e5f-492c-be31-afef2b0e6a94" (UID: "11ad1a17-2e5f-492c-be31-afef2b0e6a94"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:10.662381 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.662355 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79b4b9b57f-5rxbd_11ad1a17-2e5f-492c-be31-afef2b0e6a94/console/0.log" Apr 17 16:35:10.662563 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.662392 2569 generic.go:358] "Generic (PLEG): container finished" podID="11ad1a17-2e5f-492c-be31-afef2b0e6a94" containerID="ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d" exitCode=2 Apr 17 16:35:10.662563 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.662428 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b4b9b57f-5rxbd" event={"ID":"11ad1a17-2e5f-492c-be31-afef2b0e6a94","Type":"ContainerDied","Data":"ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d"} Apr 17 16:35:10.662563 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.662456 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79b4b9b57f-5rxbd" Apr 17 16:35:10.662563 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.662473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79b4b9b57f-5rxbd" event={"ID":"11ad1a17-2e5f-492c-be31-afef2b0e6a94","Type":"ContainerDied","Data":"eb40b093d04ba7d19fad78b5e041e55b00cb028a7217dd7c764a7d41280589d7"} Apr 17 16:35:10.662563 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.662494 2569 scope.go:117] "RemoveContainer" containerID="ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d" Apr 17 16:35:10.671244 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.671227 2569 scope.go:117] "RemoveContainer" containerID="ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d" Apr 17 16:35:10.671492 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:35:10.671475 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d\": container with ID starting with ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d not found: ID does not exist" containerID="ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d" Apr 17 16:35:10.671545 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.671500 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d"} err="failed to get container status \"ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d\": rpc error: code = NotFound desc = could not find container \"ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d\": container with ID starting with ad4156c2d33973212198b9b7e3d5358976b17408a08ef632c358e7e03153794d not found: ID does not exist" Apr 17 16:35:10.683312 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.683256 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79b4b9b57f-5rxbd"] Apr 17 16:35:10.687300 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.687277 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79b4b9b57f-5rxbd"] Apr 17 16:35:10.703585 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.703552 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-oauth-config\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:35:10.703585 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.703583 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-serving-cert\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:35:10.703718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.703595 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-service-ca\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:35:10.703718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.703604 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kv7pn\" (UniqueName: \"kubernetes.io/projected/11ad1a17-2e5f-492c-be31-afef2b0e6a94-kube-api-access-kv7pn\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:35:10.703718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.703614 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-oauth-serving-cert\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:35:10.703718 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.703622 2569 reconciler_common.go:299] "Volume 
detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/11ad1a17-2e5f-492c-be31-afef2b0e6a94-console-config\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 16:35:10.914517 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:10.914486 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ad1a17-2e5f-492c-be31-afef2b0e6a94" path="/var/lib/kubelet/pods/11ad1a17-2e5f-492c-be31-afef2b0e6a94/volumes" Apr 17 16:35:22.808380 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:22.808325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:35:22.811584 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:22.811550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79adaa92-9fae-4abb-b9ae-335440dbe8f1-metrics-certs\") pod \"network-metrics-daemon-njmgk\" (UID: \"79adaa92-9fae-4abb-b9ae-335440dbe8f1\") " pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:35:22.915346 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:22.915320 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bq7nv\"" Apr 17 16:35:22.923482 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:22.923453 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-njmgk" Apr 17 16:35:23.083892 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:23.078527 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-njmgk"] Apr 17 16:35:23.088709 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:35:23.088680 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79adaa92_9fae_4abb_b9ae_335440dbe8f1.slice/crio-ff5f0dcf9684250b54cda8135d9b15756921002a98b32a262c6aaee0466de53f WatchSource:0}: Error finding container ff5f0dcf9684250b54cda8135d9b15756921002a98b32a262c6aaee0466de53f: Status 404 returned error can't find the container with id ff5f0dcf9684250b54cda8135d9b15756921002a98b32a262c6aaee0466de53f Apr 17 16:35:23.701108 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:23.701044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-njmgk" event={"ID":"79adaa92-9fae-4abb-b9ae-335440dbe8f1","Type":"ContainerStarted","Data":"ff5f0dcf9684250b54cda8135d9b15756921002a98b32a262c6aaee0466de53f"} Apr 17 16:35:24.705196 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:24.705157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-njmgk" event={"ID":"79adaa92-9fae-4abb-b9ae-335440dbe8f1","Type":"ContainerStarted","Data":"1b2ea2c86485c414b8a005cb0c1e932f188e9c6e3877d5224b4d374b4daf72f4"} Apr 17 16:35:24.705196 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:24.705199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-njmgk" event={"ID":"79adaa92-9fae-4abb-b9ae-335440dbe8f1","Type":"ContainerStarted","Data":"2ee310f5fd6a7b7518d435e7b6d9aa97b6f632d058594eb1aafa6a1c4c65b1a6"} Apr 17 16:35:24.722352 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:24.722297 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-njmgk" podStartSLOduration=252.630765987 podStartE2EDuration="4m13.722278748s" podCreationTimestamp="2026-04-17 16:31:11 +0000 UTC" firstStartedPulling="2026-04-17 16:35:23.090609333 +0000 UTC m=+252.747416519" lastFinishedPulling="2026-04-17 16:35:24.182122089 +0000 UTC m=+253.838929280" observedRunningTime="2026-04-17 16:35:24.721326182 +0000 UTC m=+254.378133391" watchObservedRunningTime="2026-04-17 16:35:24.722278748 +0000 UTC m=+254.379085958" Apr 17 16:35:30.796395 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:30.796295 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:30.815941 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:30.815904 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:31.742186 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:31.742160 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:35:50.359255 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:35:50.359202 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7bhrv" podUID="db105214-70a5-4d57-b705-6d896bd0f8a3" Apr 17 16:35:50.359255 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:35:50.359232 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-scnk5" podUID="97268ee4-4d26-4550-8a7e-78ee5bbc45c0" Apr 17 16:35:50.780023 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:50.779992 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-scnk5" Apr 17 16:35:50.780192 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:50.779992 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:35:53.775504 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:53.775445 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:35:53.777890 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:53.777865 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db105214-70a5-4d57-b705-6d896bd0f8a3-cert\") pod \"ingress-canary-7bhrv\" (UID: \"db105214-70a5-4d57-b705-6d896bd0f8a3\") " pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:35:53.783657 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:53.783636 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-txqpq\"" Apr 17 16:35:53.791399 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:53.791379 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7bhrv" Apr 17 16:35:53.876472 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:53.876438 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5" Apr 17 16:35:53.880451 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:53.880423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/97268ee4-4d26-4550-8a7e-78ee5bbc45c0-metrics-tls\") pod \"dns-default-scnk5\" (UID: \"97268ee4-4d26-4550-8a7e-78ee5bbc45c0\") " pod="openshift-dns/dns-default-scnk5" Apr 17 16:35:53.908682 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:53.908655 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7bhrv"] Apr 17 16:35:53.911280 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:35:53.911248 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb105214_70a5_4d57_b705_6d896bd0f8a3.slice/crio-7f536280ecc846490e9c199db6fffa24beba731c86dcd2bd522b29e23d7c222b WatchSource:0}: Error finding container 7f536280ecc846490e9c199db6fffa24beba731c86dcd2bd522b29e23d7c222b: Status 404 returned error can't find the container with id 7f536280ecc846490e9c199db6fffa24beba731c86dcd2bd522b29e23d7c222b Apr 17 16:35:54.083928 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:54.083836 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-f96dz\"" Apr 17 16:35:54.092041 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:54.092018 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-scnk5" Apr 17 16:35:54.211869 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:54.211847 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-scnk5"] Apr 17 16:35:54.213934 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:35:54.213909 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97268ee4_4d26_4550_8a7e_78ee5bbc45c0.slice/crio-fc1ccae8b1e53ebd067f5a9cb0d6d20a892aee56658fbf21121c7c388fdaf924 WatchSource:0}: Error finding container fc1ccae8b1e53ebd067f5a9cb0d6d20a892aee56658fbf21121c7c388fdaf924: Status 404 returned error can't find the container with id fc1ccae8b1e53ebd067f5a9cb0d6d20a892aee56658fbf21121c7c388fdaf924 Apr 17 16:35:54.793188 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:54.793146 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-scnk5" event={"ID":"97268ee4-4d26-4550-8a7e-78ee5bbc45c0","Type":"ContainerStarted","Data":"fc1ccae8b1e53ebd067f5a9cb0d6d20a892aee56658fbf21121c7c388fdaf924"} Apr 17 16:35:54.794377 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:54.794351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7bhrv" event={"ID":"db105214-70a5-4d57-b705-6d896bd0f8a3","Type":"ContainerStarted","Data":"7f536280ecc846490e9c199db6fffa24beba731c86dcd2bd522b29e23d7c222b"} Apr 17 16:35:56.800861 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:56.800823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-scnk5" event={"ID":"97268ee4-4d26-4550-8a7e-78ee5bbc45c0","Type":"ContainerStarted","Data":"c74f9e63c2c9397e1574a78c04b839b99bc127499cc6b21fd128c8ff95d1a5eb"} Apr 17 16:35:56.800861 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:56.800863 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-scnk5" 
event={"ID":"97268ee4-4d26-4550-8a7e-78ee5bbc45c0","Type":"ContainerStarted","Data":"0fa353c6e122620f1e82b36acf09ef867071ec329d934cc1567f8990635113fc"} Apr 17 16:35:56.801308 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:56.801015 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-scnk5" Apr 17 16:35:56.802032 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:56.802012 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7bhrv" event={"ID":"db105214-70a5-4d57-b705-6d896bd0f8a3","Type":"ContainerStarted","Data":"e8c468568718a7371d08554372f2feb0b18b5d0aabd5025468040467ae375989"} Apr 17 16:35:56.817171 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:56.817118 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-scnk5" podStartSLOduration=251.927456613 podStartE2EDuration="4m13.81710191s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:35:54.215725688 +0000 UTC m=+283.872532873" lastFinishedPulling="2026-04-17 16:35:56.105370984 +0000 UTC m=+285.762178170" observedRunningTime="2026-04-17 16:35:56.816479808 +0000 UTC m=+286.473287017" watchObservedRunningTime="2026-04-17 16:35:56.81710191 +0000 UTC m=+286.473909119" Apr 17 16:35:56.829873 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:35:56.829692 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7bhrv" podStartSLOduration=251.639562106 podStartE2EDuration="4m13.829677882s" podCreationTimestamp="2026-04-17 16:31:43 +0000 UTC" firstStartedPulling="2026-04-17 16:35:53.913041907 +0000 UTC m=+283.569849104" lastFinishedPulling="2026-04-17 16:35:56.103157693 +0000 UTC m=+285.759964880" observedRunningTime="2026-04-17 16:35:56.829402328 +0000 UTC m=+286.486209552" watchObservedRunningTime="2026-04-17 16:35:56.829677882 +0000 UTC m=+286.486485090" Apr 17 
16:36:06.808064 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:36:06.808032 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-scnk5" Apr 17 16:36:10.834226 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:36:10.834198 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log" Apr 17 16:36:10.834665 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:36:10.834313 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log" Apr 17 16:36:10.837309 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:36:10.837290 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:41:09.151181 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151147 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-9f6kf"] Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151435 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087770ad-46ca-429b-9fb8-a24f6cdc1d69" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151447 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="087770ad-46ca-429b-9fb8-a24f6cdc1d69" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151455 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151461 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151478 2569 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="11ad1a17-2e5f-492c-be31-afef2b0e6a94" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151484 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ad1a17-2e5f-492c-be31-afef2b0e6a94" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151529 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="087770ad-46ca-429b-9fb8-a24f6cdc1d69" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151536 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7735eef9-d6d0-40ea-85e5-fa65b68a0dc6" containerName="console" Apr 17 16:41:09.151588 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.151544 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="11ad1a17-2e5f-492c-be31-afef2b0e6a94" containerName="console" Apr 17 16:41:09.154238 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.154222 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:09.158078 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.158052 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 16:41:09.158078 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.158086 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 16:41:09.158278 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.158185 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-vsxpp\"" Apr 17 16:41:09.158707 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.158690 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 16:41:09.167064 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.167044 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9f6kf"] Apr 17 16:41:09.279523 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.279490 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/94324528-2d75-4d3c-9ada-a1a268e2ace9-tls-certs\") pod \"model-serving-api-86f7b4b499-9f6kf\" (UID: \"94324528-2d75-4d3c-9ada-a1a268e2ace9\") " pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:09.279701 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.279547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf87g\" (UniqueName: \"kubernetes.io/projected/94324528-2d75-4d3c-9ada-a1a268e2ace9-kube-api-access-pf87g\") pod \"model-serving-api-86f7b4b499-9f6kf\" (UID: \"94324528-2d75-4d3c-9ada-a1a268e2ace9\") " pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:09.380905 ip-10-0-136-182 kubenswrapper[2569]: 
I0417 16:41:09.380865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf87g\" (UniqueName: \"kubernetes.io/projected/94324528-2d75-4d3c-9ada-a1a268e2ace9-kube-api-access-pf87g\") pod \"model-serving-api-86f7b4b499-9f6kf\" (UID: \"94324528-2d75-4d3c-9ada-a1a268e2ace9\") " pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:09.381061 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.380926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/94324528-2d75-4d3c-9ada-a1a268e2ace9-tls-certs\") pod \"model-serving-api-86f7b4b499-9f6kf\" (UID: \"94324528-2d75-4d3c-9ada-a1a268e2ace9\") " pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:09.381061 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:41:09.381031 2569 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 17 16:41:09.381135 ip-10-0-136-182 kubenswrapper[2569]: E0417 16:41:09.381095 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94324528-2d75-4d3c-9ada-a1a268e2ace9-tls-certs podName:94324528-2d75-4d3c-9ada-a1a268e2ace9 nodeName:}" failed. No retries permitted until 2026-04-17 16:41:09.881075816 +0000 UTC m=+599.537883010 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/94324528-2d75-4d3c-9ada-a1a268e2ace9-tls-certs") pod "model-serving-api-86f7b4b499-9f6kf" (UID: "94324528-2d75-4d3c-9ada-a1a268e2ace9") : secret "model-serving-api-tls" not found Apr 17 16:41:09.391122 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.391099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf87g\" (UniqueName: \"kubernetes.io/projected/94324528-2d75-4d3c-9ada-a1a268e2ace9-kube-api-access-pf87g\") pod \"model-serving-api-86f7b4b499-9f6kf\" (UID: \"94324528-2d75-4d3c-9ada-a1a268e2ace9\") " pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:09.886175 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.886135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/94324528-2d75-4d3c-9ada-a1a268e2ace9-tls-certs\") pod \"model-serving-api-86f7b4b499-9f6kf\" (UID: \"94324528-2d75-4d3c-9ada-a1a268e2ace9\") " pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:09.888614 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:09.888585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/94324528-2d75-4d3c-9ada-a1a268e2ace9-tls-certs\") pod \"model-serving-api-86f7b4b499-9f6kf\" (UID: \"94324528-2d75-4d3c-9ada-a1a268e2ace9\") " pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:10.064754 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:10.064721 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:10.226145 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:10.226117 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-9f6kf"] Apr 17 16:41:10.228123 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:41:10.228096 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94324528_2d75_4d3c_9ada_a1a268e2ace9.slice/crio-0b0463f617a85a07d650938a6dc43658a6929604c15030c9b35bf0a701b9df11 WatchSource:0}: Error finding container 0b0463f617a85a07d650938a6dc43658a6929604c15030c9b35bf0a701b9df11: Status 404 returned error can't find the container with id 0b0463f617a85a07d650938a6dc43658a6929604c15030c9b35bf0a701b9df11 Apr 17 16:41:10.229870 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:10.229856 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:41:10.697538 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:10.697507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9f6kf" event={"ID":"94324528-2d75-4d3c-9ada-a1a268e2ace9","Type":"ContainerStarted","Data":"0b0463f617a85a07d650938a6dc43658a6929604c15030c9b35bf0a701b9df11"} Apr 17 16:41:10.855563 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:10.855530 2569 scope.go:117] "RemoveContainer" containerID="ddeb9a2a7717a0e1efb01bdcbf592afdbb0b51dd433f5ce0d53da02a587e8734" Apr 17 16:41:10.871613 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:10.871585 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log" Apr 17 16:41:10.875497 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:10.875469 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log" Apr 17 16:41:12.705978 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:12.705942 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-9f6kf" event={"ID":"94324528-2d75-4d3c-9ada-a1a268e2ace9","Type":"ContainerStarted","Data":"85abfa1cbdc7eb1dac5a5cace1bd887fe060446904a72d8971327204dc92c857"} Apr 17 16:41:12.706379 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:12.706065 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:12.722347 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:12.722300 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-9f6kf" podStartSLOduration=1.395402037 podStartE2EDuration="3.722285612s" podCreationTimestamp="2026-04-17 16:41:09 +0000 UTC" firstStartedPulling="2026-04-17 16:41:10.229974881 +0000 UTC m=+599.886782068" lastFinishedPulling="2026-04-17 16:41:12.556858453 +0000 UTC m=+602.213665643" observedRunningTime="2026-04-17 16:41:12.721045274 +0000 UTC m=+602.377852504" watchObservedRunningTime="2026-04-17 16:41:12.722285612 +0000 UTC m=+602.379092820" Apr 17 16:41:23.713576 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:23.713549 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-9f6kf" Apr 17 16:41:25.132824 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.132736 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-6hnc5"] Apr 17 16:41:25.136126 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.136110 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6hnc5"
Apr 17 16:41:25.138576 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.138551 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 17 16:41:25.138576 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.138571 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mdnc6\""
Apr 17 16:41:25.142888 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.142866 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6hnc5"]
Apr 17 16:41:25.219417 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.219381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96hzj\" (UniqueName: \"kubernetes.io/projected/22de6498-3617-4d8a-b5a6-261cb73d92b4-kube-api-access-96hzj\") pod \"s3-init-6hnc5\" (UID: \"22de6498-3617-4d8a-b5a6-261cb73d92b4\") " pod="kserve/s3-init-6hnc5"
Apr 17 16:41:25.320165 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.320131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96hzj\" (UniqueName: \"kubernetes.io/projected/22de6498-3617-4d8a-b5a6-261cb73d92b4-kube-api-access-96hzj\") pod \"s3-init-6hnc5\" (UID: \"22de6498-3617-4d8a-b5a6-261cb73d92b4\") " pod="kserve/s3-init-6hnc5"
Apr 17 16:41:25.328173 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.328148 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96hzj\" (UniqueName: \"kubernetes.io/projected/22de6498-3617-4d8a-b5a6-261cb73d92b4-kube-api-access-96hzj\") pod \"s3-init-6hnc5\" (UID: \"22de6498-3617-4d8a-b5a6-261cb73d92b4\") " pod="kserve/s3-init-6hnc5"
Apr 17 16:41:25.457487 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.457410 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6hnc5"
Apr 17 16:41:25.576336 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.576302 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6hnc5"]
Apr 17 16:41:25.579300 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:41:25.579262 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22de6498_3617_4d8a_b5a6_261cb73d92b4.slice/crio-526d892a47932a3f2e2954bbcfb674d24de25ac7162f789c78d1120171623406 WatchSource:0}: Error finding container 526d892a47932a3f2e2954bbcfb674d24de25ac7162f789c78d1120171623406: Status 404 returned error can't find the container with id 526d892a47932a3f2e2954bbcfb674d24de25ac7162f789c78d1120171623406
Apr 17 16:41:25.741084 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:25.741001 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6hnc5" event={"ID":"22de6498-3617-4d8a-b5a6-261cb73d92b4","Type":"ContainerStarted","Data":"526d892a47932a3f2e2954bbcfb674d24de25ac7162f789c78d1120171623406"}
Apr 17 16:41:30.758227 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:30.758191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6hnc5" event={"ID":"22de6498-3617-4d8a-b5a6-261cb73d92b4","Type":"ContainerStarted","Data":"69f9e23aac9338e77e9f542cbd99ea2f1b1997e487a538176ef28714695e2079"}
Apr 17 16:41:30.774868 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:30.774796 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-6hnc5" podStartSLOduration=1.315926596 podStartE2EDuration="5.774780273s" podCreationTimestamp="2026-04-17 16:41:25 +0000 UTC" firstStartedPulling="2026-04-17 16:41:25.581139033 +0000 UTC m=+615.237946220" lastFinishedPulling="2026-04-17 16:41:30.039992711 +0000 UTC m=+619.696799897" observedRunningTime="2026-04-17 16:41:30.772245968 +0000 UTC m=+620.429053187" watchObservedRunningTime="2026-04-17 16:41:30.774780273 +0000 UTC m=+620.431587481"
Apr 17 16:41:33.768366 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:33.768330 2569 generic.go:358] "Generic (PLEG): container finished" podID="22de6498-3617-4d8a-b5a6-261cb73d92b4" containerID="69f9e23aac9338e77e9f542cbd99ea2f1b1997e487a538176ef28714695e2079" exitCode=0
Apr 17 16:41:33.768366 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:33.768372 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6hnc5" event={"ID":"22de6498-3617-4d8a-b5a6-261cb73d92b4","Type":"ContainerDied","Data":"69f9e23aac9338e77e9f542cbd99ea2f1b1997e487a538176ef28714695e2079"}
Apr 17 16:41:34.895303 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:34.895281 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6hnc5"
Apr 17 16:41:35.002486 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:35.002451 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96hzj\" (UniqueName: \"kubernetes.io/projected/22de6498-3617-4d8a-b5a6-261cb73d92b4-kube-api-access-96hzj\") pod \"22de6498-3617-4d8a-b5a6-261cb73d92b4\" (UID: \"22de6498-3617-4d8a-b5a6-261cb73d92b4\") "
Apr 17 16:41:35.004723 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:35.004695 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22de6498-3617-4d8a-b5a6-261cb73d92b4-kube-api-access-96hzj" (OuterVolumeSpecName: "kube-api-access-96hzj") pod "22de6498-3617-4d8a-b5a6-261cb73d92b4" (UID: "22de6498-3617-4d8a-b5a6-261cb73d92b4"). InnerVolumeSpecName "kube-api-access-96hzj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:41:35.103208 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:35.103123 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96hzj\" (UniqueName: \"kubernetes.io/projected/22de6498-3617-4d8a-b5a6-261cb73d92b4-kube-api-access-96hzj\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\""
Apr 17 16:41:35.774319 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:35.774293 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6hnc5"
Apr 17 16:41:35.774490 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:35.774323 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6hnc5" event={"ID":"22de6498-3617-4d8a-b5a6-261cb73d92b4","Type":"ContainerDied","Data":"526d892a47932a3f2e2954bbcfb674d24de25ac7162f789c78d1120171623406"}
Apr 17 16:41:35.774490 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:35.774350 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="526d892a47932a3f2e2954bbcfb674d24de25ac7162f789c78d1120171623406"
Apr 17 16:41:46.162876 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.162835 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-dm2gb"]
Apr 17 16:41:46.163238 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.163137 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22de6498-3617-4d8a-b5a6-261cb73d92b4" containerName="s3-init"
Apr 17 16:41:46.163238 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.163148 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="22de6498-3617-4d8a-b5a6-261cb73d92b4" containerName="s3-init"
Apr 17 16:41:46.163238 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.163213 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="22de6498-3617-4d8a-b5a6-261cb73d92b4" containerName="s3-init"
Apr 17 16:41:46.166480 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.166462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-dm2gb"
Apr 17 16:41:46.168954 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.168932 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 17 16:41:46.169058 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.168933 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mdnc6\""
Apr 17 16:41:46.172992 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.172968 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-dm2gb"]
Apr 17 16:41:46.177663 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.177633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdt2\" (UniqueName: \"kubernetes.io/projected/6fa75649-527c-47a4-8593-5e0d2a41c02b-kube-api-access-9wdt2\") pod \"s3-tls-init-custom-dm2gb\" (UID: \"6fa75649-527c-47a4-8593-5e0d2a41c02b\") " pod="kserve/s3-tls-init-custom-dm2gb"
Apr 17 16:41:46.278842 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.278774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdt2\" (UniqueName: \"kubernetes.io/projected/6fa75649-527c-47a4-8593-5e0d2a41c02b-kube-api-access-9wdt2\") pod \"s3-tls-init-custom-dm2gb\" (UID: \"6fa75649-527c-47a4-8593-5e0d2a41c02b\") " pod="kserve/s3-tls-init-custom-dm2gb"
Apr 17 16:41:46.287864 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.287833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdt2\" (UniqueName: \"kubernetes.io/projected/6fa75649-527c-47a4-8593-5e0d2a41c02b-kube-api-access-9wdt2\") pod \"s3-tls-init-custom-dm2gb\" (UID: \"6fa75649-527c-47a4-8593-5e0d2a41c02b\") " pod="kserve/s3-tls-init-custom-dm2gb"
Apr 17 16:41:46.487942 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.487862 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-dm2gb"
Apr 17 16:41:46.609919 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.609882 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-dm2gb"]
Apr 17 16:41:46.612685 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:41:46.612657 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fa75649_527c_47a4_8593_5e0d2a41c02b.slice/crio-950e31bbd716e8fa58ac972b603543cf997a27b4948d5b7a371d62085848497f WatchSource:0}: Error finding container 950e31bbd716e8fa58ac972b603543cf997a27b4948d5b7a371d62085848497f: Status 404 returned error can't find the container with id 950e31bbd716e8fa58ac972b603543cf997a27b4948d5b7a371d62085848497f
Apr 17 16:41:46.807254 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.807216 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-dm2gb" event={"ID":"6fa75649-527c-47a4-8593-5e0d2a41c02b","Type":"ContainerStarted","Data":"49ccb5deab64370f381daeaaf095319726a2262eefe0df6d339fb5da42ef13fb"}
Apr 17 16:41:46.807254 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.807252 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-dm2gb" event={"ID":"6fa75649-527c-47a4-8593-5e0d2a41c02b","Type":"ContainerStarted","Data":"950e31bbd716e8fa58ac972b603543cf997a27b4948d5b7a371d62085848497f"}
Apr 17 16:41:46.822494 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:46.822443 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-dm2gb" podStartSLOduration=0.822423742 podStartE2EDuration="822.423742ms" podCreationTimestamp="2026-04-17 16:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:41:46.822007173 +0000 UTC m=+636.478814396" watchObservedRunningTime="2026-04-17 16:41:46.822423742 +0000 UTC m=+636.479230951"
Apr 17 16:41:51.822014 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:51.821983 2569 generic.go:358] "Generic (PLEG): container finished" podID="6fa75649-527c-47a4-8593-5e0d2a41c02b" containerID="49ccb5deab64370f381daeaaf095319726a2262eefe0df6d339fb5da42ef13fb" exitCode=0
Apr 17 16:41:51.822429 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:51.822063 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-dm2gb" event={"ID":"6fa75649-527c-47a4-8593-5e0d2a41c02b","Type":"ContainerDied","Data":"49ccb5deab64370f381daeaaf095319726a2262eefe0df6d339fb5da42ef13fb"}
Apr 17 16:41:52.950060 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:52.950036 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-dm2gb"
Apr 17 16:41:53.023709 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:53.023677 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wdt2\" (UniqueName: \"kubernetes.io/projected/6fa75649-527c-47a4-8593-5e0d2a41c02b-kube-api-access-9wdt2\") pod \"6fa75649-527c-47a4-8593-5e0d2a41c02b\" (UID: \"6fa75649-527c-47a4-8593-5e0d2a41c02b\") "
Apr 17 16:41:53.025694 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:53.025668 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa75649-527c-47a4-8593-5e0d2a41c02b-kube-api-access-9wdt2" (OuterVolumeSpecName: "kube-api-access-9wdt2") pod "6fa75649-527c-47a4-8593-5e0d2a41c02b" (UID: "6fa75649-527c-47a4-8593-5e0d2a41c02b"). InnerVolumeSpecName "kube-api-access-9wdt2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:41:53.124353 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:53.124264 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9wdt2\" (UniqueName: \"kubernetes.io/projected/6fa75649-527c-47a4-8593-5e0d2a41c02b-kube-api-access-9wdt2\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\""
Apr 17 16:41:53.828594 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:53.828567 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-dm2gb"
Apr 17 16:41:53.828781 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:53.828567 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-dm2gb" event={"ID":"6fa75649-527c-47a4-8593-5e0d2a41c02b","Type":"ContainerDied","Data":"950e31bbd716e8fa58ac972b603543cf997a27b4948d5b7a371d62085848497f"}
Apr 17 16:41:53.828781 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:53.828669 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950e31bbd716e8fa58ac972b603543cf997a27b4948d5b7a371d62085848497f"
Apr 17 16:41:57.023780 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.023747 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-tjr6w"]
Apr 17 16:41:57.024224 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.024079 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fa75649-527c-47a4-8593-5e0d2a41c02b" containerName="s3-tls-init-custom"
Apr 17 16:41:57.024224 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.024096 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa75649-527c-47a4-8593-5e0d2a41c02b" containerName="s3-tls-init-custom"
Apr 17 16:41:57.024224 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.024202 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fa75649-527c-47a4-8593-5e0d2a41c02b" containerName="s3-tls-init-custom"
Apr 17 16:41:57.029750 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.029726 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-tjr6w"
Apr 17 16:41:57.032279 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.032259 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 17 16:41:57.032416 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.032327 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mdnc6\""
Apr 17 16:41:57.032956 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.032933 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-tjr6w"]
Apr 17 16:41:57.054852 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.054820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxtvr\" (UniqueName: \"kubernetes.io/projected/27c19e34-303e-4ae2-9926-095bf96fb8a4-kube-api-access-nxtvr\") pod \"s3-tls-init-serving-tjr6w\" (UID: \"27c19e34-303e-4ae2-9926-095bf96fb8a4\") " pod="kserve/s3-tls-init-serving-tjr6w"
Apr 17 16:41:57.155832 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.155773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxtvr\" (UniqueName: \"kubernetes.io/projected/27c19e34-303e-4ae2-9926-095bf96fb8a4-kube-api-access-nxtvr\") pod \"s3-tls-init-serving-tjr6w\" (UID: \"27c19e34-303e-4ae2-9926-095bf96fb8a4\") " pod="kserve/s3-tls-init-serving-tjr6w"
Apr 17 16:41:57.163841 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.163797 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxtvr\" (UniqueName: \"kubernetes.io/projected/27c19e34-303e-4ae2-9926-095bf96fb8a4-kube-api-access-nxtvr\") pod \"s3-tls-init-serving-tjr6w\" (UID: \"27c19e34-303e-4ae2-9926-095bf96fb8a4\") " pod="kserve/s3-tls-init-serving-tjr6w"
Apr 17 16:41:57.346917 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.346791 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-tjr6w"
Apr 17 16:41:57.469904 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.469866 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-tjr6w"]
Apr 17 16:41:57.472876 ip-10-0-136-182 kubenswrapper[2569]: W0417 16:41:57.472844 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27c19e34_303e_4ae2_9926_095bf96fb8a4.slice/crio-bbad9ac3a57231d3a342e19ea08bcf6bdcd2d3480530a8e1bb7f84836d107886 WatchSource:0}: Error finding container bbad9ac3a57231d3a342e19ea08bcf6bdcd2d3480530a8e1bb7f84836d107886: Status 404 returned error can't find the container with id bbad9ac3a57231d3a342e19ea08bcf6bdcd2d3480530a8e1bb7f84836d107886
Apr 17 16:41:57.841854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.841797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-tjr6w" event={"ID":"27c19e34-303e-4ae2-9926-095bf96fb8a4","Type":"ContainerStarted","Data":"8a16921bd2d54ee1b9dfdec4bb0e23190fa77276916014f7d88aa807b2af02d6"}
Apr 17 16:41:57.841854 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.841853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-tjr6w" event={"ID":"27c19e34-303e-4ae2-9926-095bf96fb8a4","Type":"ContainerStarted","Data":"bbad9ac3a57231d3a342e19ea08bcf6bdcd2d3480530a8e1bb7f84836d107886"}
Apr 17 16:41:57.857033 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:41:57.856982 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-tjr6w" podStartSLOduration=0.856966258 podStartE2EDuration="856.966258ms" podCreationTimestamp="2026-04-17 16:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:41:57.856714144 +0000 UTC m=+647.513521352" watchObservedRunningTime="2026-04-17 16:41:57.856966258 +0000 UTC m=+647.513773465"
Apr 17 16:42:02.862861 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:02.862826 2569 generic.go:358] "Generic (PLEG): container finished" podID="27c19e34-303e-4ae2-9926-095bf96fb8a4" containerID="8a16921bd2d54ee1b9dfdec4bb0e23190fa77276916014f7d88aa807b2af02d6" exitCode=0
Apr 17 16:42:02.863223 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:02.862895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-tjr6w" event={"ID":"27c19e34-303e-4ae2-9926-095bf96fb8a4","Type":"ContainerDied","Data":"8a16921bd2d54ee1b9dfdec4bb0e23190fa77276916014f7d88aa807b2af02d6"}
Apr 17 16:42:03.997503 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:03.997480 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-tjr6w"
Apr 17 16:42:04.112733 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:04.112696 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxtvr\" (UniqueName: \"kubernetes.io/projected/27c19e34-303e-4ae2-9926-095bf96fb8a4-kube-api-access-nxtvr\") pod \"27c19e34-303e-4ae2-9926-095bf96fb8a4\" (UID: \"27c19e34-303e-4ae2-9926-095bf96fb8a4\") "
Apr 17 16:42:04.114815 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:04.114786 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c19e34-303e-4ae2-9926-095bf96fb8a4-kube-api-access-nxtvr" (OuterVolumeSpecName: "kube-api-access-nxtvr") pod "27c19e34-303e-4ae2-9926-095bf96fb8a4" (UID: "27c19e34-303e-4ae2-9926-095bf96fb8a4"). InnerVolumeSpecName "kube-api-access-nxtvr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:42:04.213906 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:04.213868 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxtvr\" (UniqueName: \"kubernetes.io/projected/27c19e34-303e-4ae2-9926-095bf96fb8a4-kube-api-access-nxtvr\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\""
Apr 17 16:42:04.870250 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:04.870210 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-tjr6w" event={"ID":"27c19e34-303e-4ae2-9926-095bf96fb8a4","Type":"ContainerDied","Data":"bbad9ac3a57231d3a342e19ea08bcf6bdcd2d3480530a8e1bb7f84836d107886"}
Apr 17 16:42:04.870250 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:04.870231 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-tjr6w"
Apr 17 16:42:04.870250 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:42:04.870241 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbad9ac3a57231d3a342e19ea08bcf6bdcd2d3480530a8e1bb7f84836d107886"
Apr 17 16:46:10.900304 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:46:10.900228 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:46:10.902772 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:46:10.902752 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:51:10.923778 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:51:10.923750 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:51:10.927274 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:51:10.927253 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:56:10.947324 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:56:10.947293 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 16:56:10.949934 ip-10-0-136-182 kubenswrapper[2569]: I0417 16:56:10.949915 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:01:10.967745 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:01:10.967714 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:01:10.971313 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:01:10.971292 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:06:10.991988 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:06:10.991954 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:06:10.996955 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:06:10.996928 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:11:11.016102 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:11:11.016066 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:11:11.021842 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:11:11.021820 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:16:11.040604 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:16:11.040523 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:16:11.043976 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:16:11.043957 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:21:11.061675 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:21:11.061646 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:21:11.065891 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:21:11.065871 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:26:11.081485 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:26:11.081454 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:26:11.091301 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:26:11.091278 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:31:11.101628 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:31:11.101500 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:31:11.112740 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:31:11.112719 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:36:11.125457 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:36:11.125353 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:36:11.134689 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:36:11.134663 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:38:05.309956 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.309918 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g2xjp/must-gather-w9pzc"]
Apr 17 17:38:05.310555 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.310369 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27c19e34-303e-4ae2-9926-095bf96fb8a4" containerName="s3-tls-init-serving"
Apr 17 17:38:05.310555 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.310388 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c19e34-303e-4ae2-9926-095bf96fb8a4" containerName="s3-tls-init-serving"
Apr 17 17:38:05.310555 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.310484 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="27c19e34-303e-4ae2-9926-095bf96fb8a4" containerName="s3-tls-init-serving"
Apr 17 17:38:05.313323 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.313302 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.316041 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.316021 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g2xjp\"/\"kube-root-ca.crt\""
Apr 17 17:38:05.316221 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.316181 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g2xjp\"/\"openshift-service-ca.crt\""
Apr 17 17:38:05.316851 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.316834 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g2xjp\"/\"default-dockercfg-h27th\""
Apr 17 17:38:05.325080 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.325058 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g2xjp/must-gather-w9pzc"]
Apr 17 17:38:05.372330 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.372295 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frjxd\" (UniqueName: \"kubernetes.io/projected/8134d850-3306-457d-9804-9c765989de2a-kube-api-access-frjxd\") pod \"must-gather-w9pzc\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.372476 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.372346 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8134d850-3306-457d-9804-9c765989de2a-must-gather-output\") pod \"must-gather-w9pzc\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.473370 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.473335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8134d850-3306-457d-9804-9c765989de2a-must-gather-output\") pod \"must-gather-w9pzc\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.473541 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.473411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frjxd\" (UniqueName: \"kubernetes.io/projected/8134d850-3306-457d-9804-9c765989de2a-kube-api-access-frjxd\") pod \"must-gather-w9pzc\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.473761 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.473730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8134d850-3306-457d-9804-9c765989de2a-must-gather-output\") pod \"must-gather-w9pzc\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.482175 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.482144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frjxd\" (UniqueName: \"kubernetes.io/projected/8134d850-3306-457d-9804-9c765989de2a-kube-api-access-frjxd\") pod \"must-gather-w9pzc\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.633377 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.633295 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g2xjp/must-gather-w9pzc"
Apr 17 17:38:05.752772 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.752625 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g2xjp/must-gather-w9pzc"]
Apr 17 17:38:05.755656 ip-10-0-136-182 kubenswrapper[2569]: W0417 17:38:05.755621 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8134d850_3306_457d_9804_9c765989de2a.slice/crio-bf866178185eb20f02ef454aeb8777f3b3b214404759c192d6b76213df9e7087 WatchSource:0}: Error finding container bf866178185eb20f02ef454aeb8777f3b3b214404759c192d6b76213df9e7087: Status 404 returned error can't find the container with id bf866178185eb20f02ef454aeb8777f3b3b214404759c192d6b76213df9e7087
Apr 17 17:38:05.757766 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:05.757752 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:38:06.562186 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:06.562135 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" event={"ID":"8134d850-3306-457d-9804-9c765989de2a","Type":"ContainerStarted","Data":"bf866178185eb20f02ef454aeb8777f3b3b214404759c192d6b76213df9e7087"}
Apr 17 17:38:10.578152 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:10.578089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" event={"ID":"8134d850-3306-457d-9804-9c765989de2a","Type":"ContainerStarted","Data":"81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae"}
Apr 17 17:38:11.583225 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:11.583184 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" event={"ID":"8134d850-3306-457d-9804-9c765989de2a","Type":"ContainerStarted","Data":"6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442"}
Apr 17 17:38:11.604353 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:11.604296 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" podStartSLOduration=1.9806504569999999 podStartE2EDuration="6.604281217s" podCreationTimestamp="2026-04-17 17:38:05 +0000 UTC" firstStartedPulling="2026-04-17 17:38:05.757894976 +0000 UTC m=+4015.414702162" lastFinishedPulling="2026-04-17 17:38:10.381525732 +0000 UTC m=+4020.038332922" observedRunningTime="2026-04-17 17:38:11.601754265 +0000 UTC m=+4021.258561479" watchObservedRunningTime="2026-04-17 17:38:11.604281217 +0000 UTC m=+4021.261088424"
Apr 17 17:38:32.646861 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:32.646826 2569 generic.go:358] "Generic (PLEG): container finished" podID="8134d850-3306-457d-9804-9c765989de2a" containerID="81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae" exitCode=0
Apr 17 17:38:32.647267 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:32.646891 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" event={"ID":"8134d850-3306-457d-9804-9c765989de2a","Type":"ContainerDied","Data":"81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae"}
Apr 17 17:38:32.647267 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:32.647239 2569 scope.go:117] "RemoveContainer" containerID="81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae"
Apr 17 17:38:33.280858 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:33.280794 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g2xjp_must-gather-w9pzc_8134d850-3306-457d-9804-9c765989de2a/gather/0.log"
Apr 17 17:38:33.932032 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:33.932000 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lbcd/must-gather-kcklb"]
Apr 17 17:38:33.935113 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:33.935096 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lbcd/must-gather-kcklb"
Apr 17 17:38:33.937443 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:33.937418 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5lbcd\"/\"default-dockercfg-r4zks\""
Apr 17 17:38:33.938360 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:33.938339 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lbcd\"/\"kube-root-ca.crt\""
Apr 17 17:38:33.938469 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:33.938361 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5lbcd\"/\"openshift-service-ca.crt\""
Apr 17 17:38:33.943049 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:33.943028 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lbcd/must-gather-kcklb"]
Apr 17 17:38:34.030565 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.030527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c059efce-05c1-475c-bc3e-886103252f4e-must-gather-output\") pod \"must-gather-kcklb\" (UID: \"c059efce-05c1-475c-bc3e-886103252f4e\") " pod="openshift-must-gather-5lbcd/must-gather-kcklb"
Apr 17 17:38:34.030744 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.030718 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrsr\" (UniqueName: \"kubernetes.io/projected/c059efce-05c1-475c-bc3e-886103252f4e-kube-api-access-mfrsr\") pod \"must-gather-kcklb\" (UID: \"c059efce-05c1-475c-bc3e-886103252f4e\") " pod="openshift-must-gather-5lbcd/must-gather-kcklb"
Apr 17 17:38:34.131945 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.131912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c059efce-05c1-475c-bc3e-886103252f4e-must-gather-output\") pod \"must-gather-kcklb\" (UID: \"c059efce-05c1-475c-bc3e-886103252f4e\") " pod="openshift-must-gather-5lbcd/must-gather-kcklb"
Apr 17 17:38:34.132122 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.131980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrsr\" (UniqueName: \"kubernetes.io/projected/c059efce-05c1-475c-bc3e-886103252f4e-kube-api-access-mfrsr\") pod \"must-gather-kcklb\" (UID: \"c059efce-05c1-475c-bc3e-886103252f4e\") " pod="openshift-must-gather-5lbcd/must-gather-kcklb"
Apr 17 17:38:34.132257 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.132239 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c059efce-05c1-475c-bc3e-886103252f4e-must-gather-output\") pod \"must-gather-kcklb\" (UID: \"c059efce-05c1-475c-bc3e-886103252f4e\") " pod="openshift-must-gather-5lbcd/must-gather-kcklb"
Apr 17 17:38:34.139719 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.139694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrsr\" (UniqueName: \"kubernetes.io/projected/c059efce-05c1-475c-bc3e-886103252f4e-kube-api-access-mfrsr\") pod \"must-gather-kcklb\" (UID: \"c059efce-05c1-475c-bc3e-886103252f4e\") " pod="openshift-must-gather-5lbcd/must-gather-kcklb"
Apr 17 17:38:34.244923 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.244834 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-5lbcd/must-gather-kcklb" Apr 17 17:38:34.363211 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.363177 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lbcd/must-gather-kcklb"] Apr 17 17:38:34.366129 ip-10-0-136-182 kubenswrapper[2569]: W0417 17:38:34.366105 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc059efce_05c1_475c_bc3e_886103252f4e.slice/crio-ffc479ae0cebbb7bf18962a64d42163122c49924b8262cbf9799357601745c76 WatchSource:0}: Error finding container ffc479ae0cebbb7bf18962a64d42163122c49924b8262cbf9799357601745c76: Status 404 returned error can't find the container with id ffc479ae0cebbb7bf18962a64d42163122c49924b8262cbf9799357601745c76 Apr 17 17:38:34.652837 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:34.652787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lbcd/must-gather-kcklb" event={"ID":"c059efce-05c1-475c-bc3e-886103252f4e","Type":"ContainerStarted","Data":"ffc479ae0cebbb7bf18962a64d42163122c49924b8262cbf9799357601745c76"} Apr 17 17:38:35.658792 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:35.658236 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lbcd/must-gather-kcklb" event={"ID":"c059efce-05c1-475c-bc3e-886103252f4e","Type":"ContainerStarted","Data":"de2d1ff9f87a262ecb4e4604ffb3018b3ab6f288619368cf87f94b6568d3c42a"} Apr 17 17:38:35.658792 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:35.658283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lbcd/must-gather-kcklb" event={"ID":"c059efce-05c1-475c-bc3e-886103252f4e","Type":"ContainerStarted","Data":"f2153eac92ff718fd8715f881785be5c9603bc7c3a35481b0fddd43fe4c9556d"} Apr 17 17:38:35.674147 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:35.673867 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-5lbcd/must-gather-kcklb" podStartSLOduration=1.8960698790000001 podStartE2EDuration="2.67384723s" podCreationTimestamp="2026-04-17 17:38:33 +0000 UTC" firstStartedPulling="2026-04-17 17:38:34.367773088 +0000 UTC m=+4044.024580277" lastFinishedPulling="2026-04-17 17:38:35.145550431 +0000 UTC m=+4044.802357628" observedRunningTime="2026-04-17 17:38:35.672602958 +0000 UTC m=+4045.329410171" watchObservedRunningTime="2026-04-17 17:38:35.67384723 +0000 UTC m=+4045.330654438" Apr 17 17:38:36.497362 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:36.497330 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5vcm7_63fffba4-9c72-4652-811d-ebe876a5f73b/global-pull-secret-syncer/0.log" Apr 17 17:38:36.793694 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:36.793654 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xcmwf_d56445ed-a7d8-44cf-b91d-6cd2dcb4d6bc/konnectivity-agent/0.log" Apr 17 17:38:36.848850 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:36.848792 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-182.ec2.internal_8d86482bf3f913f02d52bdaa697da1d7/haproxy/0.log" Apr 17 17:38:38.785333 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:38.785286 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g2xjp/must-gather-w9pzc"] Apr 17 17:38:38.785907 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:38.785576 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" podUID="8134d850-3306-457d-9804-9c765989de2a" containerName="copy" containerID="cri-o://6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442" gracePeriod=2 Apr 17 17:38:38.788041 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:38.788005 2569 status_manager.go:895] "Failed to get status for pod" 
podUID="8134d850-3306-457d-9804-9c765989de2a" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" err="pods \"must-gather-w9pzc\" is forbidden: User \"system:node:ip-10-0-136-182.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-g2xjp\": no relationship found between node 'ip-10-0-136-182.ec2.internal' and this object" Apr 17 17:38:38.789114 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:38.789088 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g2xjp/must-gather-w9pzc"] Apr 17 17:38:39.218512 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.218430 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g2xjp_must-gather-w9pzc_8134d850-3306-457d-9804-9c765989de2a/copy/0.log" Apr 17 17:38:39.219192 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.219066 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" Apr 17 17:38:39.400836 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.400117 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8134d850-3306-457d-9804-9c765989de2a-must-gather-output\") pod \"8134d850-3306-457d-9804-9c765989de2a\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " Apr 17 17:38:39.400836 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.400204 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frjxd\" (UniqueName: \"kubernetes.io/projected/8134d850-3306-457d-9804-9c765989de2a-kube-api-access-frjxd\") pod \"8134d850-3306-457d-9804-9c765989de2a\" (UID: \"8134d850-3306-457d-9804-9c765989de2a\") " Apr 17 17:38:39.402103 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.402065 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8134d850-3306-457d-9804-9c765989de2a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8134d850-3306-457d-9804-9c765989de2a" (UID: "8134d850-3306-457d-9804-9c765989de2a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:39.414135 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.414070 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8134d850-3306-457d-9804-9c765989de2a-kube-api-access-frjxd" (OuterVolumeSpecName: "kube-api-access-frjxd") pod "8134d850-3306-457d-9804-9c765989de2a" (UID: "8134d850-3306-457d-9804-9c765989de2a"). InnerVolumeSpecName "kube-api-access-frjxd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:39.501122 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.501080 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frjxd\" (UniqueName: \"kubernetes.io/projected/8134d850-3306-457d-9804-9c765989de2a-kube-api-access-frjxd\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 17:38:39.501122 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.501122 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8134d850-3306-457d-9804-9c765989de2a-must-gather-output\") on node \"ip-10-0-136-182.ec2.internal\" DevicePath \"\"" Apr 17 17:38:39.673508 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.673381 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g2xjp_must-gather-w9pzc_8134d850-3306-457d-9804-9c765989de2a/copy/0.log" Apr 17 17:38:39.677831 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.674111 2569 generic.go:358] "Generic (PLEG): container finished" podID="8134d850-3306-457d-9804-9c765989de2a" containerID="6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442" exitCode=143 Apr 17 17:38:39.677831 ip-10-0-136-182 
kubenswrapper[2569]: I0417 17:38:39.674243 2569 scope.go:117] "RemoveContainer" containerID="6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442" Apr 17 17:38:39.677831 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.674385 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g2xjp/must-gather-w9pzc" Apr 17 17:38:39.699833 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.699793 2569 scope.go:117] "RemoveContainer" containerID="81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae" Apr 17 17:38:39.732536 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.732439 2569 scope.go:117] "RemoveContainer" containerID="6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442" Apr 17 17:38:39.733702 ip-10-0-136-182 kubenswrapper[2569]: E0417 17:38:39.733657 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442\": container with ID starting with 6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442 not found: ID does not exist" containerID="6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442" Apr 17 17:38:39.733848 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.733702 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442"} err="failed to get container status \"6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442\": rpc error: code = NotFound desc = could not find container \"6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442\": container with ID starting with 6ed20d90e97d2d09f0d015b261832f2aefa98854d559f5772ea3b6a6ff3a6442 not found: ID does not exist" Apr 17 17:38:39.733848 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.733727 2569 scope.go:117] "RemoveContainer" 
containerID="81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae" Apr 17 17:38:39.734176 ip-10-0-136-182 kubenswrapper[2569]: E0417 17:38:39.734149 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae\": container with ID starting with 81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae not found: ID does not exist" containerID="81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae" Apr 17 17:38:39.734258 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:39.734185 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae"} err="failed to get container status \"81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae\": rpc error: code = NotFound desc = could not find container \"81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae\": container with ID starting with 81f33464d966bbefd5a0e14aba0d40bf9984deebe194b562bc5284a1225699ae not found: ID does not exist" Apr 17 17:38:40.296199 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.296158 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cb655_73c01fa2-0bb5-4830-8566-287275e3788e/kube-state-metrics/0.log" Apr 17 17:38:40.325839 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.325777 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cb655_73c01fa2-0bb5-4830-8566-287275e3788e/kube-rbac-proxy-main/0.log" Apr 17 17:38:40.360465 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.360382 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cb655_73c01fa2-0bb5-4830-8566-287275e3788e/kube-rbac-proxy-self/0.log" Apr 17 17:38:40.426642 
ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.426602 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8stz2_eb81ba0c-4f9c-47f0-be8d-5c88e55e8ceb/monitoring-plugin/0.log" Apr 17 17:38:40.624237 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.624208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzpx4_e9bdf773-6e19-473e-91a4-6e5e975799cc/node-exporter/0.log" Apr 17 17:38:40.654780 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.654747 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzpx4_e9bdf773-6e19-473e-91a4-6e5e975799cc/kube-rbac-proxy/0.log" Apr 17 17:38:40.684899 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.684864 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzpx4_e9bdf773-6e19-473e-91a4-6e5e975799cc/init-textfile/0.log" Apr 17 17:38:40.716177 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.716147 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5dn2r_2c1b1707-f1e8-419a-bb77-e8d261512299/kube-rbac-proxy-main/0.log" Apr 17 17:38:40.745295 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.745263 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5dn2r_2c1b1707-f1e8-419a-bb77-e8d261512299/kube-rbac-proxy-self/0.log" Apr 17 17:38:40.770469 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.770434 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5dn2r_2c1b1707-f1e8-419a-bb77-e8d261512299/openshift-state-metrics/0.log" Apr 17 17:38:40.821211 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.821177 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28c4d7cd-c381-4e90-8fc2-82cbb0c46b07/prometheus/0.log" Apr 17 17:38:40.843284 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.843252 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28c4d7cd-c381-4e90-8fc2-82cbb0c46b07/config-reloader/0.log" Apr 17 17:38:40.872716 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.872686 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28c4d7cd-c381-4e90-8fc2-82cbb0c46b07/thanos-sidecar/0.log" Apr 17 17:38:40.899160 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.899073 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28c4d7cd-c381-4e90-8fc2-82cbb0c46b07/kube-rbac-proxy-web/0.log" Apr 17 17:38:40.916433 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.916389 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8134d850-3306-457d-9804-9c765989de2a" path="/var/lib/kubelet/pods/8134d850-3306-457d-9804-9c765989de2a/volumes" Apr 17 17:38:40.924759 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.924729 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28c4d7cd-c381-4e90-8fc2-82cbb0c46b07/kube-rbac-proxy/0.log" Apr 17 17:38:40.951847 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.951821 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28c4d7cd-c381-4e90-8fc2-82cbb0c46b07/kube-rbac-proxy-thanos/0.log" Apr 17 17:38:40.976866 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:40.976826 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_28c4d7cd-c381-4e90-8fc2-82cbb0c46b07/init-config-reloader/0.log" Apr 17 17:38:41.070387 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:41.070346 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-gqjm6_4608d06d-372e-4545-b3a1-6275d8b88c82/prometheus-operator-admission-webhook/0.log" Apr 17 17:38:42.481506 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:42.481477 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-6svzp_8dec519d-f277-4d63-80ab-1edcb2ec1275/networking-console-plugin/0.log" Apr 17 17:38:43.332650 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.332620 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-wn6rg_f8e80855-0ca0-4413-990c-7bfef055921c/download-server/0.log" Apr 17 17:38:43.792020 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.791985 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp"] Apr 17 17:38:43.792496 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.792472 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8134d850-3306-457d-9804-9c765989de2a" containerName="copy" Apr 17 17:38:43.792662 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.792649 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8134d850-3306-457d-9804-9c765989de2a" containerName="copy" Apr 17 17:38:43.792763 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.792751 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8134d850-3306-457d-9804-9c765989de2a" containerName="gather" Apr 17 17:38:43.792882 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.792868 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8134d850-3306-457d-9804-9c765989de2a" containerName="gather" Apr 17 17:38:43.793070 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.793058 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8134d850-3306-457d-9804-9c765989de2a" containerName="copy" Apr 17 
17:38:43.793162 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.793152 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8134d850-3306-457d-9804-9c765989de2a" containerName="gather" Apr 17 17:38:43.798039 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.798018 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.804135 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.804105 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp"] Apr 17 17:38:43.850841 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.850780 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-proc\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.851030 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.850899 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-podres\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.851030 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.850979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-sys\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.851030 ip-10-0-136-182 kubenswrapper[2569]: 
I0417 17:38:43.851020 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-lib-modules\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.851192 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.851072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpkfq\" (UniqueName: \"kubernetes.io/projected/12622849-4582-4c24-af80-95e57ad4f123-kube-api-access-dpkfq\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.951824 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.951767 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpkfq\" (UniqueName: \"kubernetes.io/projected/12622849-4582-4c24-af80-95e57ad4f123-kube-api-access-dpkfq\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952016 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.951862 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-proc\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952016 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.951898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-podres\") pod 
\"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952016 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.951954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-sys\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952016 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.951989 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-lib-modules\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952241 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.952164 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-lib-modules\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952542 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.952516 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-proc\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952640 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.952624 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-podres\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.952699 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.952675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12622849-4582-4c24-af80-95e57ad4f123-sys\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:43.961336 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:43.961302 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpkfq\" (UniqueName: \"kubernetes.io/projected/12622849-4582-4c24-af80-95e57ad4f123-kube-api-access-dpkfq\") pod \"perf-node-gather-daemonset-n74bp\" (UID: \"12622849-4582-4c24-af80-95e57ad4f123\") " pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" Apr 17 17:38:44.114443 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.114356 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp"
Apr 17 17:38:44.235150 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.235115 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp"]
Apr 17 17:38:44.238667 ip-10-0-136-182 kubenswrapper[2569]: W0417 17:38:44.238638 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod12622849_4582_4c24_af80_95e57ad4f123.slice/crio-da275363f67eb868076ba2d89103b16f4fb24ff8e26b120f9648512b09e960dd WatchSource:0}: Error finding container da275363f67eb868076ba2d89103b16f4fb24ff8e26b120f9648512b09e960dd: Status 404 returned error can't find the container with id da275363f67eb868076ba2d89103b16f4fb24ff8e26b120f9648512b09e960dd
Apr 17 17:38:44.499441 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.499371 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-scnk5_97268ee4-4d26-4550-8a7e-78ee5bbc45c0/dns/0.log"
Apr 17 17:38:44.521689 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.521648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-scnk5_97268ee4-4d26-4550-8a7e-78ee5bbc45c0/kube-rbac-proxy/0.log"
Apr 17 17:38:44.593784 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.593755 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qskd2_c855efcb-8d09-4e04-8063-d5bb5ae67dc3/dns-node-resolver/0.log"
Apr 17 17:38:44.696639 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.696605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" event={"ID":"12622849-4582-4c24-af80-95e57ad4f123","Type":"ContainerStarted","Data":"c74e57d4c2a7ef383e9ed4dd40aca53736cccb553ce8ec23189acce65d631967"}
Apr 17 17:38:44.696639 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.696644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" event={"ID":"12622849-4582-4c24-af80-95e57ad4f123","Type":"ContainerStarted","Data":"da275363f67eb868076ba2d89103b16f4fb24ff8e26b120f9648512b09e960dd"}
Apr 17 17:38:44.696862 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.696752 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp"
Apr 17 17:38:44.723378 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:44.723329 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp" podStartSLOduration=1.7233126140000001 podStartE2EDuration="1.723312614s" podCreationTimestamp="2026-04-17 17:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:38:44.722493257 +0000 UTC m=+4054.379300466" watchObservedRunningTime="2026-04-17 17:38:44.723312614 +0000 UTC m=+4054.380119821"
Apr 17 17:38:45.092253 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:45.092217 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sfzc4_5c3742c8-dbba-43ea-99fc-4321ab7b1156/node-ca/0.log"
Apr 17 17:38:46.128848 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:46.128797 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7bhrv_db105214-70a5-4d57-b705-6d896bd0f8a3/serve-healthcheck-canary/0.log"
Apr 17 17:38:46.704545 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:46.704514 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bnmth_85668985-17d4-4c12-88cd-1204c9fd0791/kube-rbac-proxy/0.log"
Apr 17 17:38:46.726866 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:46.726826 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bnmth_85668985-17d4-4c12-88cd-1204c9fd0791/exporter/0.log"
Apr 17 17:38:46.751608 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:46.751583 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bnmth_85668985-17d4-4c12-88cd-1204c9fd0791/extractor/0.log"
Apr 17 17:38:48.937751 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:48.937715 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-9f6kf_94324528-2d75-4d3c-9ada-a1a268e2ace9/server/0.log"
Apr 17 17:38:49.342776 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:49.342749 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-6hnc5_22de6498-3617-4d8a-b5a6-261cb73d92b4/s3-init/0.log"
Apr 17 17:38:49.366026 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:49.365996 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-dm2gb_6fa75649-527c-47a4-8593-5e0d2a41c02b/s3-tls-init-custom/0.log"
Apr 17 17:38:49.387715 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:49.387683 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-tjr6w_27c19e34-303e-4ae2-9926-095bf96fb8a4/s3-tls-init-serving/0.log"
Apr 17 17:38:50.711695 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:50.711661 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5lbcd/perf-node-gather-daemonset-n74bp"
Apr 17 17:38:53.784153 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:53.784078 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-zlzch_f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068/kube-storage-version-migrator-operator/1.log"
Apr 17 17:38:53.786549 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:53.786506 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-zlzch_f4bc3e8d-bd1c-4102-aab9-bacaaeb0e068/kube-storage-version-migrator-operator/0.log"
Apr 17 17:38:54.985708 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:54.985676 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fvk4x_4415d384-34d0-412a-8d4e-c8f3077b28f5/kube-multus-additional-cni-plugins/0.log"
Apr 17 17:38:55.009367 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.009338 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fvk4x_4415d384-34d0-412a-8d4e-c8f3077b28f5/egress-router-binary-copy/0.log"
Apr 17 17:38:55.030918 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.030876 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fvk4x_4415d384-34d0-412a-8d4e-c8f3077b28f5/cni-plugins/0.log"
Apr 17 17:38:55.055322 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.055251 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fvk4x_4415d384-34d0-412a-8d4e-c8f3077b28f5/bond-cni-plugin/0.log"
Apr 17 17:38:55.086776 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.086745 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fvk4x_4415d384-34d0-412a-8d4e-c8f3077b28f5/routeoverride-cni/0.log"
Apr 17 17:38:55.120954 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.120930 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fvk4x_4415d384-34d0-412a-8d4e-c8f3077b28f5/whereabouts-cni-bincopy/0.log"
Apr 17 17:38:55.150771 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.150746 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fvk4x_4415d384-34d0-412a-8d4e-c8f3077b28f5/whereabouts-cni/0.log"
Apr 17 17:38:55.223478 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.223444 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jdjbz_611a935a-b080-4fc7-bb9b-c51d000c8434/kube-multus/0.log"
Apr 17 17:38:55.294570 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.294532 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-njmgk_79adaa92-9fae-4abb-b9ae-335440dbe8f1/network-metrics-daemon/0.log"
Apr 17 17:38:55.318470 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:55.318393 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-njmgk_79adaa92-9fae-4abb-b9ae-335440dbe8f1/kube-rbac-proxy/0.log"
Apr 17 17:38:56.118487 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.118454 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-controller/0.log"
Apr 17 17:38:56.148398 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.148367 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/0.log"
Apr 17 17:38:56.184622 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.184588 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovn-acl-logging/1.log"
Apr 17 17:38:56.212355 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.212316 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/kube-rbac-proxy-node/0.log"
Apr 17 17:38:56.244289 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.244262 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 17:38:56.267335 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.267297 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/northd/0.log"
Apr 17 17:38:56.296563 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.296539 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/nbdb/0.log"
Apr 17 17:38:56.320092 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.320054 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/sbdb/0.log"
Apr 17 17:38:56.534239 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:56.534194 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2cjj2_56d56159-cb46-4e72-b5a0-a94c8cc6452d/ovnkube-controller/0.log"
Apr 17 17:38:58.249538 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:58.249508 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-2nzr7_b47a469b-6847-4d3a-b7a8-64dad8f5db47/check-endpoints/0.log"
Apr 17 17:38:58.270412 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:58.270391 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-454rf_cf2a6beb-fbfb-4062-b87b-a178033b242c/network-check-target-container/0.log"
Apr 17 17:38:59.210842 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:59.210794 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qcf6j_d70a0068-8f05-43f8-965d-fae1fcc76d7c/iptables-alerter/0.log"
Apr 17 17:38:59.907931 ip-10-0-136-182 kubenswrapper[2569]: I0417 17:38:59.907899 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-h97kv_5fc75060-d6e4-4b7a-bba7-b763cd3b68a1/tuned/0.log"