Apr 24 16:39:07.408218 ip-10-0-143-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:39:07.908773 ip-10-0-143-144 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:07.908773 ip-10-0-143-144 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:39:07.908773 ip-10-0-143-144 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:07.908773 ip-10-0-143-144 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:39:07.908773 ip-10-0-143-144 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:07.910452 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.910362 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:39:07.915036 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915018 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:07.915036 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915034 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:07.915036 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915038 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915041 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915045 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915048 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915050 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915053 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915056 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915059 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915061 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915064 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915067 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915070 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915073 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915075 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915084 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915089 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915092 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915095 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915097 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915100 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:07.915134 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915102 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915105 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915107 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915110 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915113 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915115 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915118 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915121 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915124 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915126 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915129 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915131 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915134 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915137 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915139 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915143 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915147 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915150 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915153 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915155 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:07.915601 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915158 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915161 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915164 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915166 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915169 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915171 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915174 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915176 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915179 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915181 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915184 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915187 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915189 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915192 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915195 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915198 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915201 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915203 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915206 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915209 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:07.916103 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915211 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915214 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915217 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915219 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915222 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915224 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915227 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915229 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915232 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915235 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915238 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915241 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915243 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915247 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915252 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915255 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915258 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915261 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915264 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:07.916583 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915267 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915270 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915273 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915275 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915279 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915663 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915668 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915673 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915676 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915679 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915682 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915685 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915688 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915691 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915693 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915696 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915698 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915701 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915704 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915706 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:07.917050 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915709 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915712 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915714 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915717 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915720 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915723 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915725 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915728 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915730 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915732 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915735 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915737 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915740 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915742 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915745 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915747 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915750 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915753 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915755 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915759 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:07.917518 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915763 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915767 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915770 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915773 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915776 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915778 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915781 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915783 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915786 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915788 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915791 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915793 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915796 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915798 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915801 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915803 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915807 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915809 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915811 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:07.918028 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915814 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915817 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915819 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915822 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915825 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915827 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915830 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915832 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915835 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915838 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915841 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915843 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915846 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915849 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915852 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915856 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915860 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915863 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915866 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:07.918487 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915870 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915872 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915875 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915878 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915880 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915883 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915886 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915888 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915891 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915893 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915896 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915899 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.915902 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.915990 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.915997 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916005 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916009 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916013 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916017 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916021 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916026 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:39:07.918967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916029 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916033 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916036 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916040 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916043 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916046 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916049 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916052 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916055 2562 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916058 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916061 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916065 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916071 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916075 2562 flags.go:64] FLAG: --config-dir=""
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916077 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916081 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916085 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916088 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916091 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916095 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916098 2562 flags.go:64] FLAG: --contention-profiling="false" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916101 2562 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916104 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916108 2562 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916112 2562 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 16:39:07.919565 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916116 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916119 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916122 2562 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916125 2562 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916128 2562 flags.go:64] FLAG: --enable-server="true" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916131 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 16:39:07.920199 
ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916136 2562 flags.go:64] FLAG: --event-burst="100" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916139 2562 flags.go:64] FLAG: --event-qps="50" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916142 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916146 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916148 2562 flags.go:64] FLAG: --eviction-hard="" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916152 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916155 2562 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916158 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916161 2562 flags.go:64] FLAG: --eviction-soft="" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916164 2562 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916167 2562 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916171 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916173 2562 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916177 2562 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916180 2562 flags.go:64] 
FLAG: --fail-swap-on="true" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916183 2562 flags.go:64] FLAG: --feature-gates="" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916187 2562 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916190 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916193 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:39:07.920199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916196 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916199 2562 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916202 2562 flags.go:64] FLAG: --help="false" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916205 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-143-144.ec2.internal" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916208 2562 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916211 2562 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916215 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916219 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916222 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: 
I0424 16:39:07.916225 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916228 2562 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916231 2562 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916234 2562 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916240 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916243 2562 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916246 2562 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916249 2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916252 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916255 2562 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916258 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916261 2562 flags.go:64] FLAG: --lock-file="" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916264 2562 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916267 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916270 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 
16:39:07.920797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916276 2562 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916278 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916282 2562 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916285 2562 flags.go:64] FLAG: --logging-format="text" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916288 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916292 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916295 2562 flags.go:64] FLAG: --manifest-url="" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916298 2562 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916302 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916306 2562 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916310 2562 flags.go:64] FLAG: --max-pods="110" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916313 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916316 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916319 2562 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:39:07.916322 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916326 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916329 2562 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916332 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916339 2562 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916343 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916345 2562 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916350 2562 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916353 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:39:07.921424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916359 2562 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916362 2562 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916365 2562 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916368 2562 flags.go:64] FLAG: --port="10250" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916371 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:39:07.922040 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:39:07.916374 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c305858a83a2f92b" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916378 2562 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916381 2562 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916384 2562 flags.go:64] FLAG: --register-node="true" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916387 2562 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916390 2562 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916394 2562 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916396 2562 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916399 2562 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916402 2562 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916405 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916408 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916411 2562 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916415 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916418 2562 flags.go:64] FLAG: --runonce="false" Apr 24 16:39:07.922040 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:39:07.916420 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916424 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916427 2562 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916430 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916433 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916437 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:39:07.922040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916440 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916443 2562 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916446 2562 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916449 2562 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916452 2562 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916456 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916459 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916462 2562 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916465 2562 
flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916471 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916474 2562 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916477 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916481 2562 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916484 2562 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916486 2562 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916489 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916492 2562 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916496 2562 flags.go:64] FLAG: --v="2" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916500 2562 flags.go:64] FLAG: --version="false" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916504 2562 flags.go:64] FLAG: --vmodule="" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916508 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.916511 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916606 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 
16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916611 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:07.922661 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916615 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916620 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916623 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916626 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916629 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916631 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916634 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916636 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916639 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916642 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916644 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916647 2562 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916650 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916653 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916656 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916659 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916662 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916664 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916667 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916670 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:07.923274 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916696 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916699 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916703 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916706 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:07.923801 
ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916710 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916715 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916718 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916722 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916724 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916727 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916730 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916733 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916735 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916738 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916741 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916743 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: 
W0424 16:39:07.916746 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916749 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916751 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:07.923801 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916754 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916756 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916778 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916781 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916783 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916786 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916790 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916793 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916796 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916799 2562 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916801 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916804 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916807 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916810 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916812 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916815 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916818 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916820 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916824 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:07.924287 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916827 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916830 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916832 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: 
W0424 16:39:07.916835 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916837 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916840 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916843 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916845 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916848 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916850 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916853 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916855 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916858 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916860 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916863 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916865 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:07.924735 
ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916868 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916871 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916874 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916878 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:07.924735 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916880 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916883 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916885 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916888 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916891 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.916893 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.917907 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.924889 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.924905 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924971 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924977 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924980 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924984 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924987 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924990 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:07.925242 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924993 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924996 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.924999 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: 
W0424 16:39:07.925002 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925004 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925007 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925010 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925012 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925015 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925018 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925020 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925023 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925026 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925029 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925031 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925034 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 
24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925036 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925039 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925041 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925044 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:07.925629 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925046 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925049 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925051 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925054 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925056 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925060 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925063 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925066 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925069 2562 feature_gate.go:328] 
unrecognized feature gate: MachineConfigNodes Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925072 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925075 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925078 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925080 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925083 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925085 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925088 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925091 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925093 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925097 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925102 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:07.926130 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925105 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925108 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925111 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925113 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925116 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925119 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925122 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925125 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925127 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925130 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925133 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925136 2562 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925138 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925141 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925143 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925146 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925149 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925153 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925157 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925159 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:07.926660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925162 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925165 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925167 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925170 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:07.927168 ip-10-0-143-144 
kubenswrapper[2562]: W0424 16:39:07.925172 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925176 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925180 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925183 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925186 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925188 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925191 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925194 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925196 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925199 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925202 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925205 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925207 2562 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925210 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925213 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:07.927168 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925215 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.925220 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925320 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925325 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925328 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925332 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925335 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925337 
2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925340 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925343 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925346 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925349 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925352 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925355 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925358 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:07.927624 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925360 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925363 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925366 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925369 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925371 2562 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstall Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925374 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925376 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925379 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925382 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925385 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925388 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925392 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925395 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925398 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925401 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925404 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925406 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 
16:39:07.925409 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925411 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:07.928004 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925414 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925416 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925419 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925421 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925425 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925428 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925430 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925433 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925436 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925439 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925442 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 
16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925453 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925458 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925461 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925464 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925467 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925469 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925472 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925476 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925479 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:07.928455 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925483 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925485 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925488 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925491 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925494 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925497 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925500 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925503 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925505 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925508 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925511 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925513 
2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925516 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925519 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925522 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925525 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925528 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925531 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925533 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:07.928940 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925536 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925539 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925541 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925544 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925547 2562 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925550 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925553 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925555 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925558 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925560 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925563 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925565 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925568 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925571 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:07.925573 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.925578 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:07.929395 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.926324 2562 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 16:39:07.930052 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.930039 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 16:39:07.930996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.930963 2562 server.go:1019] "Starting client certificate rotation" Apr 24 16:39:07.931089 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.931069 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:39:07.931140 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.931124 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:39:07.958291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.958271 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:39:07.963902 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.963875 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:39:07.982851 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.982834 2562 log.go:25] "Validated CRI v1 runtime API" Apr 24 16:39:07.990390 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.990371 2562 log.go:25] "Validated CRI v1 image API" Apr 24 16:39:07.991677 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.991663 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 16:39:07.992325 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:39:07.992308 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:39:07.994914 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.994891 2562 fs.go:135] Filesystem UUIDs: map[28e0e1a9-ac6a-4e63-9f19-c24ddc998961:/dev/nvme0n1p4 524686f9-1a01-48a8-8ca9-81400628fc3c:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 24 16:39:07.994986 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:07.994914 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 16:39:08.002500 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.002389 2562 manager.go:217] Machine: {Timestamp:2026-04-24 16:39:08.000022024 +0000 UTC m=+0.457335907 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099184 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b485ae3357526f289d179eaca8463 SystemUUID:ec2b485a-e335-7526-f289-d179eaca8463 BootID:86792fba-a98c-4e7d-929b-1e8e4a7623ec Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 
DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9a:98:4e:88:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9a:98:4e:88:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:d6:cd:1d:72:21 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 16:39:08.002500 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.002500 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 16:39:08.002599 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.002584 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 16:39:08.005227 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.005201 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 16:39:08.005369 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.005230 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 16:39:08.005414 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.005379 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 16:39:08.005414 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.005388 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 16:39:08.005414 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.005401 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 16:39:08.007002 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.006991 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 16:39:08.008434 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.008418 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 16:39:08.008727 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.008717 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 16:39:08.011310 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.011300 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 16:39:08.011343 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.011320 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 16:39:08.011343 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.011333 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 16:39:08.011343 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.011342 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 24 16:39:08.011442 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.011351 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 16:39:08.012404 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.012392 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 16:39:08.012446 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.012413 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 16:39:08.015343 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.015328 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 16:39:08.016750 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.016738 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 16:39:08.018779 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018764 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018782 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018789 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018797 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018803 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018811 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018820 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018826 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018833 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 16:39:08.018846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018839 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 16:39:08.019085 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018864 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 16:39:08.019085 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018876 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 16:39:08.019085 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.018884 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sxq45"
Apr 24 16:39:08.019953 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.019927 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 16:39:08.019991 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.019956 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 16:39:08.024217 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.024201 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 16:39:08.024293 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.024246 2562 server.go:1295] "Started kubelet"
Apr 24 16:39:08.024383 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.024321 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 16:39:08.024437 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.024362 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 16:39:08.024437 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.024415 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 16:39:08.024864 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.024838 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 16:39:08.024864 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.024846 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 16:39:08.025053 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.024968 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 16:39:08.025282 ip-10-0-143-144 systemd[1]: Started Kubernetes Kubelet.
Apr 24 16:39:08.025409 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.025387 2562 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 16:39:08.026007 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.025989 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 16:39:08.028773 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.028753 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sxq45"
Apr 24 16:39:08.033343 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.033325 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 16:39:08.033438 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.033343 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:08.034000 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.033977 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 16:39:08.034103 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034084 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 16:39:08.034201 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034030 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 16:39:08.034268 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.034137 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.034318 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.034142 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 16:39:08.034318 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034202 2562 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 16:39:08.034318 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034285 2562 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 16:39:08.034625 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034609 2562 factory.go:55] Registering systemd factory
Apr 24 16:39:08.034669 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034629 2562 factory.go:223] Registration of the systemd container factory successfully
Apr 24 16:39:08.034830 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034818 2562 factory.go:153] Registering CRI-O factory
Apr 24 16:39:08.034830 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034830 2562 factory.go:223] Registration of the crio container factory successfully
Apr 24 16:39:08.034903 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034870 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 16:39:08.034903 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034884 2562 factory.go:103] Registering Raw factory
Apr 24 16:39:08.034903 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.034893 2562 manager.go:1196] Started watching for new ooms in manager
Apr 24 16:39:08.035550 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.035537 2562 manager.go:319] Starting recovery of all containers
Apr 24 16:39:08.035993 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.035974 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:08.038889 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.038808 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-144.ec2.internal\" not found" node="ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.041560 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.041531 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 16:39:08.047081 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.046969 2562 manager.go:324] Recovery completed
Apr 24 16:39:08.052430 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.052417 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:08.054875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.054860 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:08.054960 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.054887 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:08.054960 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.054897 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:08.055427 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.055415 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 16:39:08.055473 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.055426 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 16:39:08.055473 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.055442 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 16:39:08.057805 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.057794 2562 policy_none.go:49] "None policy: Start"
Apr 24 16:39:08.057839 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.057809 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 16:39:08.057839 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.057819 2562 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 16:39:08.091132 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.090972 2562 manager.go:341] "Starting Device Plugin manager"
Apr 24 16:39:08.091132 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.091007 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 16:39:08.091132 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.091020 2562 server.go:85] "Starting device plugin registration server"
Apr 24 16:39:08.091333 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.091296 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 16:39:08.091333 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.091310 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 16:39:08.091467 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.091453 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 16:39:08.091540 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.091527 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 16:39:08.091593 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.091540 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 16:39:08.092169 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.092141 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 16:39:08.092254 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.092199 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.172506 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.172415 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 16:39:08.172506 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.172454 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 16:39:08.172506 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.172480 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 16:39:08.172506 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.172488 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 16:39:08.172770 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.172530 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 16:39:08.175464 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.175448 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:08.191825 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.191808 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:08.192851 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.192831 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:08.192955 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.192865 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:08.192955 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.192875 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:08.192955 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.192900 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.207665 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.207637 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.207665 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.207672 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-144.ec2.internal\": node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.240774 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.240751 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.272853 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.272824 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal"]
Apr 24 16:39:08.272920 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.272905 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:08.273888 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.273871 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:08.273998 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.273902 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:08.273998 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.273915 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:08.275202 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.275189 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:08.275384 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.275369 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.275437 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.275397 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:08.275922 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.275903 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:08.276028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.275909 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:08.276028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.275986 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:08.276028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.276007 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:08.276165 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.275965 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:08.276165 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.276056 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:08.277252 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.277238 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.277317 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.277262 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 16:39:08.277888 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.277872 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 16:39:08.277985 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.277897 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 16:39:08.277985 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.277908 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeHasSufficientPID"
Apr 24 16:39:08.304787 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.304761 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-144.ec2.internal\" not found" node="ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.309233 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.309216 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-144.ec2.internal\" not found" node="ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.335883 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.335858 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0475c0131c709c68e4c47751862dac8e-config\") pod \"kube-apiserver-proxy-ip-10-0-143-144.ec2.internal\" (UID: \"0475c0131c709c68e4c47751862dac8e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.336010 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.335885 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/58d73468b484d3d9535d53ba0e0c885d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal\" (UID: \"58d73468b484d3d9535d53ba0e0c885d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.336010 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.335919 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58d73468b484d3d9535d53ba0e0c885d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal\" (UID: \"58d73468b484d3d9535d53ba0e0c885d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.340983 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.340963 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.436448 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.436359 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0475c0131c709c68e4c47751862dac8e-config\") pod \"kube-apiserver-proxy-ip-10-0-143-144.ec2.internal\" (UID: \"0475c0131c709c68e4c47751862dac8e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.436448 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.436395 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/58d73468b484d3d9535d53ba0e0c885d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal\" (UID: \"58d73468b484d3d9535d53ba0e0c885d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.436448 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.436428 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58d73468b484d3d9535d53ba0e0c885d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal\" (UID: \"58d73468b484d3d9535d53ba0e0c885d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.436643 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.436465 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0475c0131c709c68e4c47751862dac8e-config\") pod \"kube-apiserver-proxy-ip-10-0-143-144.ec2.internal\" (UID: \"0475c0131c709c68e4c47751862dac8e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.436643 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.436475 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/58d73468b484d3d9535d53ba0e0c885d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal\" (UID: \"58d73468b484d3d9535d53ba0e0c885d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.436643 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.436477 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58d73468b484d3d9535d53ba0e0c885d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal\" (UID: \"58d73468b484d3d9535d53ba0e0c885d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.441454 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.441433 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.542439 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.542402 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.606622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.606595 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.612380 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.612358 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal"
Apr 24 16:39:08.643197 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.643166 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.743745 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.743647 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.844334 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.844302 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:08.930535 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.930505 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 16:39:08.931126 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.930650 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 16:39:08.931126 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:08.930680 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 16:39:08.945002 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:08.944979 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:09.030636 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.030552 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:34:08 +0000 UTC" deadline="2027-12-15 01:17:54.834162643 +0000 UTC"
Apr 24 16:39:09.030636 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.030596 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14384h38m45.803569685s"
Apr 24 16:39:09.033754 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.033734 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:09.045075 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:09.045053 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-144.ec2.internal\" not found"
Apr 24 16:39:09.047000 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.046970 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:39:09.069114 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.069084 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ncvdk" Apr 24 16:39:09.077289 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.077261 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ncvdk" Apr 24 16:39:09.101187 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.101166 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:09.134069 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.134048 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal" Apr 24 16:39:09.138263 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:09.138235 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d73468b484d3d9535d53ba0e0c885d.slice/crio-ebb251ccb1e3cf890d3337364debe7971ef07db3c0aabf9b7827cc489647730c WatchSource:0}: Error finding container ebb251ccb1e3cf890d3337364debe7971ef07db3c0aabf9b7827cc489647730c: Status 404 returned error can't find the container with id ebb251ccb1e3cf890d3337364debe7971ef07db3c0aabf9b7827cc489647730c Apr 24 16:39:09.138516 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:09.138496 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0475c0131c709c68e4c47751862dac8e.slice/crio-378bca629dc71273436e4b202f48cf5ce8ad86dd17b974559277ca00a0ce13d2 WatchSource:0}: Error finding container 378bca629dc71273436e4b202f48cf5ce8ad86dd17b974559277ca00a0ce13d2: Status 404 returned error can't find the container with id 378bca629dc71273436e4b202f48cf5ce8ad86dd17b974559277ca00a0ce13d2 Apr 24 16:39:09.144665 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:39:09.144649 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:39:09.146619 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.146601 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:09.148553 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.148538 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal" Apr 24 16:39:09.155164 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.155146 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:09.170670 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.170647 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:09.175328 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.175288 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal" event={"ID":"0475c0131c709c68e4c47751862dac8e","Type":"ContainerStarted","Data":"378bca629dc71273436e4b202f48cf5ce8ad86dd17b974559277ca00a0ce13d2"} Apr 24 16:39:09.176212 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.176191 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal" event={"ID":"58d73468b484d3d9535d53ba0e0c885d","Type":"ContainerStarted","Data":"ebb251ccb1e3cf890d3337364debe7971ef07db3c0aabf9b7827cc489647730c"} Apr 24 16:39:09.921671 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:09.921637 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 
24 16:39:10.014742 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.014704 2562 apiserver.go:52] "Watching apiserver" Apr 24 16:39:10.022378 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.022355 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 16:39:10.025349 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.025323 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-w297f","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal","openshift-multus/multus-vlffk","openshift-network-diagnostics/network-check-target-j4684","kube-system/konnectivity-agent-t5dcg","kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal","openshift-multus/multus-additional-cni-plugins-xjmsw","openshift-multus/network-metrics-daemon-wp5zn","openshift-network-operator/iptables-alerter-xlhds","openshift-ovn-kubernetes/ovnkube-node-gtxl5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7","openshift-cluster-node-tuning-operator/tuned-xkqhq","openshift-dns/node-resolver-m9vmw"] Apr 24 16:39:10.027797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.027770 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" Apr 24 16:39:10.027797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.027791 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.029034 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.029014 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:10.029136 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.029085 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:10.030216 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.030196 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:10.030383 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.030363 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:39:10.030547 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.030529 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:39:10.030609 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.030565 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:39:10.030670 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.030652 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-crgbx\"" Apr 24 16:39:10.030718 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.030686 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:39:10.030819 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.030803 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:39:10.031350 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.031333 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:39:10.031525 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.031509 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bbbr5\"" Apr 24 16:39:10.032266 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.032130 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 16:39:10.032566 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.032544 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 16:39:10.032759 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.032741 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w297f" Apr 24 16:39:10.033056 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.033035 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lcsbh\"" Apr 24 16:39:10.034135 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.034004 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:10.034135 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.034024 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xlhds" Apr 24 16:39:10.034135 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.034070 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:10.035134 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.035046 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:39:10.035470 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.035447 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:39:10.035559 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.035460 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mlsxz\"" Apr 24 16:39:10.035559 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.035482 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:39:10.035967 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.035921 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.036846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.036822 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:10.037006 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.036989 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 16:39:10.037221 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.037181 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:10.037308 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.037268 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9h4fl\"" Apr 24 16:39:10.038626 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.038363 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:39:10.038626 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.038397 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-7wgt2\"" Apr 24 16:39:10.038626 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.038412 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:39:10.038626 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.038432 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" Apr 24 16:39:10.038626 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.038435 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:39:10.038626 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.038373 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:39:10.040198 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.039973 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:39:10.040304 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.040288 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.040365 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.040343 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:39:10.043296 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.042327 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 16:39:10.043296 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.042631 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 16:39:10.043296 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.042894 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-m9vmw" Apr 24 16:39:10.043296 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.043030 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8jfrg\"" Apr 24 16:39:10.043296 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.043036 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 16:39:10.043597 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.043454 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:10.043597 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.043542 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:10.043989 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.043816 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z9jzk\"" Apr 24 16:39:10.045177 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.045160 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 16:39:10.045428 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.045407 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 16:39:10.045511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.045467 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wwhfn\"" Apr 24 16:39:10.046581 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046553 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-cni-binary-copy\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.046675 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046590 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw" Apr 24 16:39:10.046675 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046617 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5357c94-3bc4-4825-808b-68599eb79e96-host\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f" Apr 24 16:39:10.046675 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046642 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-run-netns\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.046675 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dlj6\" (UniqueName: \"kubernetes.io/projected/93f47728-cffc-4da9-9791-92c3d70ac2d2-kube-api-access-5dlj6\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " 
pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:10.046875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046693 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw" Apr 24 16:39:10.046875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046744 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwxkk\" (UniqueName: \"kubernetes.io/projected/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-kube-api-access-kwxkk\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw" Apr 24 16:39:10.046875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046774 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:10.046875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046801 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-var-lib-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.046875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046829 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-cni-netd\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.046875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046853 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-conf-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046876 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-daemon-config\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046918 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-slash\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.046971 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-system-cni-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:39:10.047002 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-multus-certs\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047041 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-etc-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047074 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-cni-multus\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047105 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88mf\" (UniqueName: \"kubernetes.io/projected/e5357c94-3bc4-4825-808b-68599eb79e96-kube-api-access-t88mf\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047137 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-systemd\") pod \"ovnkube-node-gtxl5\" (UID: 
\"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047153 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-k8s-cni-cncf-io\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047168 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-cni-bin\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047182 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-hostroot\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047212 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/863af5bc-e499-4451-82ca-2c9265a5df62-iptables-alerter-script\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047226 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-os-release\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047252 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-node-log\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047274 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.047291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047296 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-cni-bin\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047326 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9wk\" (UniqueName: \"kubernetes.io/projected/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-kube-api-access-9p9wk\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047366 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-etc-kubernetes\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047514 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6db26d97-eaa5-4cf6-9b9c-a4d322db5952-agent-certs\") pod \"konnectivity-agent-t5dcg\" (UID: \"6db26d97-eaa5-4cf6-9b9c-a4d322db5952\") " pod="kube-system/konnectivity-agent-t5dcg"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047558 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6db26d97-eaa5-4cf6-9b9c-a4d322db5952-konnectivity-ca\") pod \"konnectivity-agent-t5dcg\" (UID: \"6db26d97-eaa5-4cf6-9b9c-a4d322db5952\") " pod="kube-system/konnectivity-agent-t5dcg"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047593 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-system-cni-dir\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047622 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-socket-dir-parent\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047645 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047664 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-cni-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047678 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-kubelet\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047693 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vht9w\" (UniqueName: \"kubernetes.io/projected/863af5bc-e499-4451-82ca-2c9265a5df62-kube-api-access-vht9w\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047712 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-systemd-units\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047727 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-netns\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047744 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-kubelet\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047798 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvv2c\" (UniqueName: \"kubernetes.io/projected/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-kube-api-access-dvv2c\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047835 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-os-release\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.047996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047864 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047889 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovnkube-config\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047916 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cnibin\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.047986 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/863af5bc-e499-4451-82ca-2c9265a5df62-host-slash\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048011 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5357c94-3bc4-4825-808b-68599eb79e96-serviceca\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048044 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048067 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-ovn\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048088 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-log-socket\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048110 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovnkube-script-lib\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048132 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-cnibin\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048153 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048189 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048230 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-env-overrides\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.048594 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.048281 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovn-node-metrics-cert\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.078516 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.078485 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:09 +0000 UTC" deadline="2027-12-06 11:34:42.618028015 +0000 UTC"
Apr 24 16:39:10.078516 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.078516 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14178h55m32.539515549s"
Apr 24 16:39:10.135144 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.135118 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 16:39:10.149147 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149121 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvv2c\" (UniqueName: \"kubernetes.io/projected/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-kube-api-access-dvv2c\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.149147 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149151 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-os-release\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.149346 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149174 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.149346 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149201 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovnkube-config\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.149346 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149250 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cnibin\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.149346 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149292 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-os-release\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.149346 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149307 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cnibin\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.149346 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149335 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/863af5bc-e499-4451-82ca-2c9265a5df62-host-slash\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149364 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5357c94-3bc4-4825-808b-68599eb79e96-serviceca\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149388 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149415 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-ovn\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149430 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149438 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-log-socket\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149473 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovnkube-script-lib\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149497 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-cnibin\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149498 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/863af5bc-e499-4451-82ca-2c9265a5df62-host-slash\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149522 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149561 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-log-socket\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149560 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.149622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149623 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-kubernetes\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149652 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-env-overrides\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149678 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovn-node-metrics-cert\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149709 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-cni-binary-copy\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149735 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149761 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5357c94-3bc4-4825-808b-68599eb79e96-host\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149797 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovnkube-config\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149790 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-run-netns\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149839 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysctl-d\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149875 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-ovn\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149875 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-systemd\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149877 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5357c94-3bc4-4825-808b-68599eb79e96-serviceca\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149911 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-lib-modules\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149926 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-cnibin\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.149964 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dlj6\" (UniqueName: \"kubernetes.io/projected/93f47728-cffc-4da9-9791-92c3d70ac2d2-kube-api-access-5dlj6\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.150044 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150044 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.150235 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150075 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwxkk\" (UniqueName: \"kubernetes.io/projected/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-kube-api-access-kwxkk\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.150980 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.150130 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:10.650082071 +0000 UTC m=+3.107395920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:10.150980 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150161 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.150980 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150227 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.150980 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150264 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5357c94-3bc4-4825-808b-68599eb79e96-host\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f"
Apr 24 16:39:10.150980 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:10.150980 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150358 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-var-lib-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.150980 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150338 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-run-netns\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.151215 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.150977 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-cni-netd\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.151215 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.151170 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-var-lib-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.151215 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.151157 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovnkube-script-lib\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.151309 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.151265 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-cni-netd\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.151309 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.151280 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 16:39:10.151309 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.151300 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/113f984b-f618-43f6-b9a2-3a6b1e63676b-tmp\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.151902 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.151715 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-env-overrides\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.151902 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.151736 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.152107 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152024 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.152236 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152217 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-conf-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.152440 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152414 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-daemon-config\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.152539 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152483 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-slash\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.152539 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152496 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-cni-binary-copy\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.152539 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152519 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-conf-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.152691 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152539 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-sys-fs\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.152691 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-slash\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.152691 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152669 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-system-cni-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.152817 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152709 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-multus-certs\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.152817 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152794 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-etc-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.152905 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152863 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a9da436-211c-45bd-9f7b-51e5eea9f69e-hosts-file\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.152975 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152917 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-cni-multus\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.152975 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.152966 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t88mf\" (UniqueName: \"kubernetes.io/projected/e5357c94-3bc4-4825-808b-68599eb79e96-kube-api-access-t88mf\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f"
Apr 24 16:39:10.153061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153034 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-etc-openvswitch\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.153061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153047 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-multus-certs\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.153160 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153057 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-cni-multus\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.153320 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153300 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-systemd\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.153408 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153369 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-system-cni-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.153470 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153437 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-socket-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.153527 ip-10-0-143-144 kubenswrapper[2562]: I0424
16:39:10.153481 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-sys\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.153579 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153531 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-k8s-cni-cncf-io\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.153579 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153568 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-cni-bin\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.153674 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153598 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-hostroot\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.153674 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153633 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/863af5bc-e499-4451-82ca-2c9265a5df62-iptables-alerter-script\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds" Apr 24 16:39:10.153775 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:39:10.153742 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-registration-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" Apr 24 16:39:10.153827 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153778 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-device-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" Apr 24 16:39:10.153827 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153811 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysconfig\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.153919 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153837 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysctl-conf\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.153919 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153872 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-os-release\") pod \"multus-vlffk\" (UID: 
\"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.153919 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153906 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-node-log\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.154085 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153958 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.154085 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.153995 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-cni-bin\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.154185 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154099 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9wk\" (UniqueName: \"kubernetes.io/projected/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-kube-api-access-9p9wk\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.154185 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154135 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-modprobe-d\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.154185 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154164 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a9da436-211c-45bd-9f7b-51e5eea9f69e-tmp-dir\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw" Apr 24 16:39:10.154323 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154197 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-etc-kubernetes\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.154323 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154233 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6db26d97-eaa5-4cf6-9b9c-a4d322db5952-agent-certs\") pod \"konnectivity-agent-t5dcg\" (UID: \"6db26d97-eaa5-4cf6-9b9c-a4d322db5952\") " pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:10.154323 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154266 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6db26d97-eaa5-4cf6-9b9c-a4d322db5952-konnectivity-ca\") pod \"konnectivity-agent-t5dcg\" (UID: \"6db26d97-eaa5-4cf6-9b9c-a4d322db5952\") " pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:10.154323 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154299 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-tuned\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.154511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154329 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-system-cni-dir\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw" Apr 24 16:39:10.154511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154373 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" Apr 24 16:39:10.154511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154400 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-run\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.154511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154432 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-host\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.154511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154468 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-socket-dir-parent\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.154511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154505 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154541 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-node-log\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154567 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154586 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-system-cni-dir\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw" Apr 24 16:39:10.156244 
ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154593 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-cni-bin\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154640 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-hostroot\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154651 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-cni-bin\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154707 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-run-systemd\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154736 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s5d8\" (UniqueName: \"kubernetes.io/projected/113f984b-f618-43f6-b9a2-3a6b1e63676b-kube-api-access-9s5d8\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.156244 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:39:10.154747 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-k8s-cni-cncf-io\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154780 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154798 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-cni-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154837 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-kubelet\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154878 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-os-release\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:39:10.154891 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-socket-dir-parent\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154892 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-cni-dir\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154986 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-etc-kubernetes\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.154988 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vht9w\" (UniqueName: \"kubernetes.io/projected/863af5bc-e499-4451-82ca-2c9265a5df62-kube-api-access-vht9w\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds" Apr 24 16:39:10.156244 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155060 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-ovn-node-metrics-cert\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: 
I0424 16:39:10.155046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-systemd-units\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155099 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-systemd-units\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155143 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74465\" (UniqueName: \"kubernetes.io/projected/301a467d-8583-49ee-aea7-425e83b7c4bf-kube-api-access-74465\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155165 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-var-lib-kubelet\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155231 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxd6\" (UniqueName: \"kubernetes.io/projected/9a9da436-211c-45bd-9f7b-51e5eea9f69e-kube-api-access-8fxd6\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " 
pod="openshift-dns/node-resolver-m9vmw" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155383 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-netns\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155450 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-kubelet\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155480 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155546 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-var-lib-kubelet\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155610 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/863af5bc-e499-4451-82ca-2c9265a5df62-iptables-alerter-script\") pod 
\"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155654 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-host-run-netns\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155698 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-host-kubelet\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.157061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.155795 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-multus-daemon-config\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.157926 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.157906 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6db26d97-eaa5-4cf6-9b9c-a4d322db5952-agent-certs\") pod \"konnectivity-agent-t5dcg\" (UID: \"6db26d97-eaa5-4cf6-9b9c-a4d322db5952\") " pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:10.159257 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.159238 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:10.159347 ip-10-0-143-144 kubenswrapper[2562]: E0424 
16:39:10.159261 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:10.159347 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.159271 2562 projected.go:194] Error preparing data for projected volume kube-api-access-84m58 for pod openshift-network-diagnostics/network-check-target-j4684: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:10.159347 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.159324 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58 podName:6235daff-03fe-4662-a803-e1884c643b19 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:10.659309806 +0000 UTC m=+3.116623655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-84m58" (UniqueName: "kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58") pod "network-check-target-j4684" (UID: "6235daff-03fe-4662-a803-e1884c643b19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:10.160788 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.160761 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvv2c\" (UniqueName: \"kubernetes.io/projected/838b0ff1-a4b5-4f05-9901-2291fd5a85a4-kube-api-access-dvv2c\") pod \"multus-vlffk\" (UID: \"838b0ff1-a4b5-4f05-9901-2291fd5a85a4\") " pod="openshift-multus/multus-vlffk" Apr 24 16:39:10.161993 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.161969 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88mf\" (UniqueName: 
\"kubernetes.io/projected/e5357c94-3bc4-4825-808b-68599eb79e96-kube-api-access-t88mf\") pod \"node-ca-w297f\" (UID: \"e5357c94-3bc4-4825-808b-68599eb79e96\") " pod="openshift-image-registry/node-ca-w297f" Apr 24 16:39:10.161993 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.161980 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwxkk\" (UniqueName: \"kubernetes.io/projected/8f1e88cc-ea55-4772-9908-69b5f02b2d4d-kube-api-access-kwxkk\") pod \"multus-additional-cni-plugins-xjmsw\" (UID: \"8f1e88cc-ea55-4772-9908-69b5f02b2d4d\") " pod="openshift-multus/multus-additional-cni-plugins-xjmsw" Apr 24 16:39:10.162447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.162429 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dlj6\" (UniqueName: \"kubernetes.io/projected/93f47728-cffc-4da9-9791-92c3d70ac2d2-kube-api-access-5dlj6\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:10.162530 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.162463 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6db26d97-eaa5-4cf6-9b9c-a4d322db5952-konnectivity-ca\") pod \"konnectivity-agent-t5dcg\" (UID: \"6db26d97-eaa5-4cf6-9b9c-a4d322db5952\") " pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:10.162965 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.162946 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9wk\" (UniqueName: \"kubernetes.io/projected/c22d7668-64ef-42e4-ac7b-d8eb07a69e1f-kube-api-access-9p9wk\") pod \"ovnkube-node-gtxl5\" (UID: \"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:10.163123 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.163105 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vht9w\" (UniqueName: \"kubernetes.io/projected/863af5bc-e499-4451-82ca-2c9265a5df62-kube-api-access-vht9w\") pod \"iptables-alerter-xlhds\" (UID: \"863af5bc-e499-4451-82ca-2c9265a5df62\") " pod="openshift-network-operator/iptables-alerter-xlhds"
Apr 24 16:39:10.256263 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256174 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a9da436-211c-45bd-9f7b-51e5eea9f69e-hosts-file\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.256263 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256229 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-socket-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.256477 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256373 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-sys\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.256477 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256422 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-sys\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.256477 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256433 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-registration-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.256477 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256379 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a9da436-211c-45bd-9f7b-51e5eea9f69e-hosts-file\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.256477 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256464 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-device-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256494 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysconfig\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256512 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-registration-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256522 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysctl-conf\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256556 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-modprobe-d\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256563 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-device-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256583 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a9da436-211c-45bd-9f7b-51e5eea9f69e-tmp-dir\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256612 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-tuned\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256655 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256691 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-run\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.256706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256695 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysctl-conf\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256716 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-host\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256368 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-socket-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256743 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5d8\" (UniqueName: \"kubernetes.io/projected/113f984b-f618-43f6-b9a2-3a6b1e63676b-kube-api-access-9s5d8\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256783 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74465\" (UniqueName: \"kubernetes.io/projected/301a467d-8583-49ee-aea7-425e83b7c4bf-kube-api-access-74465\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256802 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-run\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256809 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxd6\" (UniqueName: \"kubernetes.io/projected/9a9da436-211c-45bd-9f7b-51e5eea9f69e-kube-api-access-8fxd6\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256832 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-host\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256860 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256869 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-modprobe-d\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256876 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-etc-selinux\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256742 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysconfig\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256903 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-var-lib-kubelet\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.256974 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-kubernetes\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257007 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysctl-d\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257022 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257031 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-systemd\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.257173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257075 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-lib-modules\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257074 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-systemd\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257110 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-kubernetes\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257124 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/113f984b-f618-43f6-b9a2-3a6b1e63676b-tmp\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257142 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a9da436-211c-45bd-9f7b-51e5eea9f69e-tmp-dir\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257151 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-var-lib-kubelet\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257154 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-sys-fs\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257188 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-sysctl-d\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/113f984b-f618-43f6-b9a2-3a6b1e63676b-lib-modules\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.258023 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.257251 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/301a467d-8583-49ee-aea7-425e83b7c4bf-sys-fs\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.259330 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.259305 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/113f984b-f618-43f6-b9a2-3a6b1e63676b-etc-tuned\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.259637 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.259617 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/113f984b-f618-43f6-b9a2-3a6b1e63676b-tmp\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.265296 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.265270 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxd6\" (UniqueName: \"kubernetes.io/projected/9a9da436-211c-45bd-9f7b-51e5eea9f69e-kube-api-access-8fxd6\") pod \"node-resolver-m9vmw\" (UID: \"9a9da436-211c-45bd-9f7b-51e5eea9f69e\") " pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.265771 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.265751 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74465\" (UniqueName: \"kubernetes.io/projected/301a467d-8583-49ee-aea7-425e83b7c4bf-kube-api-access-74465\") pod \"aws-ebs-csi-driver-node-vdgj7\" (UID: \"301a467d-8583-49ee-aea7-425e83b7c4bf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.265771 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.265761 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5d8\" (UniqueName: \"kubernetes.io/projected/113f984b-f618-43f6-b9a2-3a6b1e63676b-kube-api-access-9s5d8\") pod \"tuned-xkqhq\" (UID: \"113f984b-f618-43f6-b9a2-3a6b1e63676b\") " pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.284990 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.284961 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:10.341201 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.341169 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xjmsw"
Apr 24 16:39:10.351026 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.351001 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlffk"
Apr 24 16:39:10.357665 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.357642 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t5dcg"
Apr 24 16:39:10.362236 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.362214 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w297f"
Apr 24 16:39:10.370061 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.370045 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xlhds"
Apr 24 16:39:10.376580 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.376563 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5"
Apr 24 16:39:10.384146 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.384125 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7"
Apr 24 16:39:10.391673 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.391649 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xkqhq"
Apr 24 16:39:10.396213 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.396192 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m9vmw"
Apr 24 16:39:10.659994 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.659901 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:39:10.659994 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:10.659960 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:10.660195 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.660108 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:10.660195 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.660134 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:10.660195 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.660143 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:10.660359 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.660226 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.660204985 +0000 UTC m=+4.117518859 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:10.660359 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.660147 2562 projected.go:194] Error preparing data for projected volume kube-api-access-84m58 for pod openshift-network-diagnostics/network-check-target-j4684: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:10.660359 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:10.660314 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58 podName:6235daff-03fe-4662-a803-e1884c643b19 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.660300908 +0000 UTC m=+4.117614756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-84m58" (UniqueName: "kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58") pod "network-check-target-j4684" (UID: "6235daff-03fe-4662-a803-e1884c643b19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:10.722971 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.722920 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301a467d_8583_49ee_aea7_425e83b7c4bf.slice/crio-d7e10cc3dd279c4b939834c277d3d672078a47ce69b7a6551e0e4008228b8c47 WatchSource:0}: Error finding container d7e10cc3dd279c4b939834c277d3d672078a47ce69b7a6551e0e4008228b8c47: Status 404 returned error can't find the container with id d7e10cc3dd279c4b939834c277d3d672078a47ce69b7a6551e0e4008228b8c47
Apr 24 16:39:10.723947 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.723909 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod838b0ff1_a4b5_4f05_9901_2291fd5a85a4.slice/crio-ff034f7235ba881bd7b0441ddc3e2ee55fff1abf4e4aaec40e9474ef1bfd6bb0 WatchSource:0}: Error finding container ff034f7235ba881bd7b0441ddc3e2ee55fff1abf4e4aaec40e9474ef1bfd6bb0: Status 404 returned error can't find the container with id ff034f7235ba881bd7b0441ddc3e2ee55fff1abf4e4aaec40e9474ef1bfd6bb0
Apr 24 16:39:10.726358 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.726319 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5357c94_3bc4_4825_808b_68599eb79e96.slice/crio-607c025df3b0978084096f97cbc514fd1092f38a35823b3282cf2420c9632f51 WatchSource:0}: Error finding container 607c025df3b0978084096f97cbc514fd1092f38a35823b3282cf2420c9632f51: Status 404 returned error can't find the container with id 607c025df3b0978084096f97cbc514fd1092f38a35823b3282cf2420c9632f51
Apr 24 16:39:10.729520 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.729497 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6db26d97_eaa5_4cf6_9b9c_a4d322db5952.slice/crio-97355c1ac419cdae6bb814a8eacce60c2e6119c46534e7b199a2f4de9640bf44 WatchSource:0}: Error finding container 97355c1ac419cdae6bb814a8eacce60c2e6119c46534e7b199a2f4de9640bf44: Status 404 returned error can't find the container with id 97355c1ac419cdae6bb814a8eacce60c2e6119c46534e7b199a2f4de9640bf44
Apr 24 16:39:10.730363 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.730323 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc22d7668_64ef_42e4_ac7b_d8eb07a69e1f.slice/crio-3fe0c0cd0d42088fba89dffa89dd2be8345feaa53c4f8b8a62fe673bba7dcfe6 WatchSource:0}: Error finding container 3fe0c0cd0d42088fba89dffa89dd2be8345feaa53c4f8b8a62fe673bba7dcfe6: Status 404 returned error can't find the container with id 3fe0c0cd0d42088fba89dffa89dd2be8345feaa53c4f8b8a62fe673bba7dcfe6
Apr 24 16:39:10.731473 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.731449 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f1e88cc_ea55_4772_9908_69b5f02b2d4d.slice/crio-133e7f0bc2f622e3bdd64fbff03ffbb16f3c97bd2c133c286e4ebff26f4a03dc WatchSource:0}: Error finding container 133e7f0bc2f622e3bdd64fbff03ffbb16f3c97bd2c133c286e4ebff26f4a03dc: Status 404 returned error can't find the container with id 133e7f0bc2f622e3bdd64fbff03ffbb16f3c97bd2c133c286e4ebff26f4a03dc
Apr 24 16:39:10.731945 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.731907 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113f984b_f618_43f6_b9a2_3a6b1e63676b.slice/crio-04fece136acc15aa436b68ad19d84d75281270eecd7935718747c87c52009028 WatchSource:0}: Error finding container 04fece136acc15aa436b68ad19d84d75281270eecd7935718747c87c52009028: Status 404 returned error can't find the container with id 04fece136acc15aa436b68ad19d84d75281270eecd7935718747c87c52009028
Apr 24 16:39:10.733660 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.733498 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod863af5bc_e499_4451_82ca_2c9265a5df62.slice/crio-1a87d9d77416462f05bf8a918c6c4cf7595b94a237c563bc5236b88121a4e0a6 WatchSource:0}: Error finding container 1a87d9d77416462f05bf8a918c6c4cf7595b94a237c563bc5236b88121a4e0a6: Status 404 returned error can't find the container with id 1a87d9d77416462f05bf8a918c6c4cf7595b94a237c563bc5236b88121a4e0a6
Apr 24 16:39:10.733916 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:10.733783 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9da436_211c_45bd_9f7b_51e5eea9f69e.slice/crio-fb7ff8d6a8af320a4e6a88d2c42d771477f95c3429ae86f7d9a40fb8f4fe2401 WatchSource:0}: Error finding container fb7ff8d6a8af320a4e6a88d2c42d771477f95c3429ae86f7d9a40fb8f4fe2401: Status 404 returned error can't find the container with id fb7ff8d6a8af320a4e6a88d2c42d771477f95c3429ae86f7d9a40fb8f4fe2401
Apr 24 16:39:11.079236 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.078711 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:09 +0000 UTC" deadline="2028-01-31 07:09:00.801274166 +0000 UTC"
Apr 24 16:39:11.079236 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.079091 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15518h29m49.722208834s"
Apr 24 16:39:11.173734 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.173194 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:11.173734 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:11.173333 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19"
Apr 24 16:39:11.185508 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.185462 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t5dcg" event={"ID":"6db26d97-eaa5-4cf6-9b9c-a4d322db5952","Type":"ContainerStarted","Data":"97355c1ac419cdae6bb814a8eacce60c2e6119c46534e7b199a2f4de9640bf44"}
Apr 24 16:39:11.191199 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.190490 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal" event={"ID":"0475c0131c709c68e4c47751862dac8e","Type":"ContainerStarted","Data":"bb7348e4f245aca576e964f932074fc8888202c0e79fbf06deed7129399cd31e"}
Apr 24 16:39:11.203517 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.203451 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xlhds" event={"ID":"863af5bc-e499-4451-82ca-2c9265a5df62","Type":"ContainerStarted","Data":"1a87d9d77416462f05bf8a918c6c4cf7595b94a237c563bc5236b88121a4e0a6"}
Apr 24 16:39:11.205225 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.205073 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-144.ec2.internal" podStartSLOduration=2.205057442 podStartE2EDuration="2.205057442s" podCreationTimestamp="2026-04-24 16:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:11.204262381 +0000 UTC m=+3.661576253" watchObservedRunningTime="2026-04-24 16:39:11.205057442 +0000 UTC m=+3.662371314"
Apr 24 16:39:11.220587 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.218633 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" event={"ID":"113f984b-f618-43f6-b9a2-3a6b1e63676b","Type":"ContainerStarted","Data":"04fece136acc15aa436b68ad19d84d75281270eecd7935718747c87c52009028"}
Apr 24 16:39:11.221871 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.221829 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerStarted","Data":"133e7f0bc2f622e3bdd64fbff03ffbb16f3c97bd2c133c286e4ebff26f4a03dc"}
Apr 24 16:39:11.226055 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.224874 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"3fe0c0cd0d42088fba89dffa89dd2be8345feaa53c4f8b8a62fe673bba7dcfe6"}
Apr 24 16:39:11.231208 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.231117 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w297f" event={"ID":"e5357c94-3bc4-4825-808b-68599eb79e96","Type":"ContainerStarted","Data":"607c025df3b0978084096f97cbc514fd1092f38a35823b3282cf2420c9632f51"}
Apr 24 16:39:11.239715 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.239685 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlffk" event={"ID":"838b0ff1-a4b5-4f05-9901-2291fd5a85a4","Type":"ContainerStarted","Data":"ff034f7235ba881bd7b0441ddc3e2ee55fff1abf4e4aaec40e9474ef1bfd6bb0"}
Apr 24 16:39:11.245116 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.245056 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" event={"ID":"301a467d-8583-49ee-aea7-425e83b7c4bf","Type":"ContainerStarted","Data":"d7e10cc3dd279c4b939834c277d3d672078a47ce69b7a6551e0e4008228b8c47"}
Apr 24 16:39:11.250844 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.250787 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m9vmw" event={"ID":"9a9da436-211c-45bd-9f7b-51e5eea9f69e","Type":"ContainerStarted","Data":"fb7ff8d6a8af320a4e6a88d2c42d771477f95c3429ae86f7d9a40fb8f4fe2401"}
Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.667384 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:11.667447 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:11.667591 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: E0424
16:39:11.667611 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:11.667625 2562 projected.go:194] Error preparing data for projected volume kube-api-access-84m58 for pod openshift-network-diagnostics/network-check-target-j4684: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:11.667686 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58 podName:6235daff-03fe-4662-a803-e1884c643b19 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.6676678 +0000 UTC m=+6.124981656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-84m58" (UniqueName: "kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58") pod "network-check-target-j4684" (UID: "6235daff-03fe-4662-a803-e1884c643b19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:11.668124 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:11.669125 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:11.668174 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:13.668157549 +0000 UTC m=+6.125471402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:12.174791 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:12.174303 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:12.174791 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:12.174444 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:12.268282 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:12.267863 2562 generic.go:358] "Generic (PLEG): container finished" podID="58d73468b484d3d9535d53ba0e0c885d" containerID="5f75932b65ce299144f383171b4ce932674c6e0e5fda0b7b65bc6edf2257af96" exitCode=0 Apr 24 16:39:12.268833 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:12.268806 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal" event={"ID":"58d73468b484d3d9535d53ba0e0c885d","Type":"ContainerDied","Data":"5f75932b65ce299144f383171b4ce932674c6e0e5fda0b7b65bc6edf2257af96"} Apr 24 16:39:13.173188 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:13.173049 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:13.173357 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:13.173195 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:13.276553 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:13.275889 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal" event={"ID":"58d73468b484d3d9535d53ba0e0c885d","Type":"ContainerStarted","Data":"36e2172fd679a5a9337af643bd128b6e230f61b7b6dd8efdbbd60fe48156b104"} Apr 24 16:39:13.684685 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:13.684645 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:13.684880 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:13.684707 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:13.684880 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:13.684857 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:13.684880 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:13.684878 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:13.685060 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:13.684893 2562 projected.go:194] Error preparing data for projected volume kube-api-access-84m58 for pod openshift-network-diagnostics/network-check-target-j4684: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:13.685060 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:13.685035 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:13.685162 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:13.685083 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:17.685065188 +0000 UTC m=+10.142379039 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:13.685162 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:13.685157 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58 podName:6235daff-03fe-4662-a803-e1884c643b19 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:17.685132578 +0000 UTC m=+10.142446440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-84m58" (UniqueName: "kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58") pod "network-check-target-j4684" (UID: "6235daff-03fe-4662-a803-e1884c643b19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:14.173744 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:14.173665 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:14.173914 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:14.173817 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:15.172752 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:15.172718 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:15.173297 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:15.172854 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:16.174679 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:16.174173 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:16.174679 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:16.174314 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:17.173738 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:17.173705 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:17.173909 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:17.173838 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:17.721861 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:17.721817 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:17.722339 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:17.721950 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:17.722339 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:17.722080 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:17.722339 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:17.722104 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:17.722339 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:17.722132 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:17.722339 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:17.722143 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:25.722124237 +0000 UTC m=+18.179438089 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:17.722339 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:17.722147 2562 projected.go:194] Error preparing data for projected volume kube-api-access-84m58 for pod openshift-network-diagnostics/network-check-target-j4684: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:17.722339 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:17.722202 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58 podName:6235daff-03fe-4662-a803-e1884c643b19 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.722184529 +0000 UTC m=+18.179498379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-84m58" (UniqueName: "kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58") pod "network-check-target-j4684" (UID: "6235daff-03fe-4662-a803-e1884c643b19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:18.174771 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:18.174320 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:18.174771 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:18.174443 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:19.172879 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:19.172841 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:19.173329 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:19.172994 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:20.173468 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:20.173424 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:20.173849 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:20.173577 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:21.172883 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:21.172852 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:21.173102 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:21.172991 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:22.173431 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:22.173398 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:22.173889 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:22.173543 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:23.173063 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:23.173033 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:23.173234 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:23.173126 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:24.173276 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:24.173243 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:24.173704 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:24.173378 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:25.173682 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:25.173638 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:25.174165 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:25.173777 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:25.771737 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:25.771692 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:25.771905 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:25.771752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:25.771905 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:25.771873 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:25.771905 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:25.771886 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:25.771905 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:25.771906 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:25.772101 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:25.771919 2562 projected.go:194] Error preparing data for projected volume kube-api-access-84m58 for pod openshift-network-diagnostics/network-check-target-j4684: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:25.772101 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:25.771966 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.771942507 +0000 UTC m=+34.229256371 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:25.772101 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:25.771983 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58 podName:6235daff-03fe-4662-a803-e1884c643b19 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.771976616 +0000 UTC m=+34.229290464 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-84m58" (UniqueName: "kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58") pod "network-check-target-j4684" (UID: "6235daff-03fe-4662-a803-e1884c643b19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:26.173713 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:26.173636 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:26.174085 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:26.173745 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:27.173624 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:27.173586 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:27.173812 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:27.173724 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:28.174280 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.174060 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:28.174747 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:28.174314 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:28.302484 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.302436 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerStarted","Data":"554997922eab52e8dfc5fcb22c80552e6427cd5c828f87ca27798ce7ef2563c5"} Apr 24 16:39:28.303954 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.303908 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w297f" event={"ID":"e5357c94-3bc4-4825-808b-68599eb79e96","Type":"ContainerStarted","Data":"febb945b7850aa32b50e12908788ab2709edb67453a94243658f270d0d1312d3"} Apr 24 16:39:28.305340 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.305307 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlffk" event={"ID":"838b0ff1-a4b5-4f05-9901-2291fd5a85a4","Type":"ContainerStarted","Data":"e96b26dc46ae99554549142be593fff9f434d32e660f77ef974157fe6e8c3721"} Apr 24 16:39:28.306817 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.306631 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" event={"ID":"301a467d-8583-49ee-aea7-425e83b7c4bf","Type":"ContainerStarted","Data":"7d8635827e7197d094592cbf0e8dba3062086b602d3d1c13eff6126717952df2"} Apr 24 16:39:28.308209 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.308174 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m9vmw" event={"ID":"9a9da436-211c-45bd-9f7b-51e5eea9f69e","Type":"ContainerStarted","Data":"b27bea97680e76c6f6fed9d04c3212de91609feac2e118eb0b54eafda70cc383"} Apr 24 16:39:28.309809 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.309735 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t5dcg" 
event={"ID":"6db26d97-eaa5-4cf6-9b9c-a4d322db5952","Type":"ContainerStarted","Data":"a8be117dfa7c968e7531d74f2176839ffa53a2f72b25cebb68053715cb42b1cb"} Apr 24 16:39:28.311290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.311264 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" event={"ID":"113f984b-f618-43f6-b9a2-3a6b1e63676b","Type":"ContainerStarted","Data":"dcce5b86581385b12cae6d27ff6f4899977af675d847d7c14d9e81675e35665c"} Apr 24 16:39:28.324651 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.324600 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-144.ec2.internal" podStartSLOduration=19.324586441 podStartE2EDuration="19.324586441s" podCreationTimestamp="2026-04-24 16:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:13.295072283 +0000 UTC m=+5.752386152" watchObservedRunningTime="2026-04-24 16:39:28.324586441 +0000 UTC m=+20.781900311" Apr 24 16:39:28.355365 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.355295 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vlffk" podStartSLOduration=3.203275834 podStartE2EDuration="20.355275553s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.725635967 +0000 UTC m=+3.182949815" lastFinishedPulling="2026-04-24 16:39:27.877635685 +0000 UTC m=+20.334949534" observedRunningTime="2026-04-24 16:39:28.340200096 +0000 UTC m=+20.797513984" watchObservedRunningTime="2026-04-24 16:39:28.355275553 +0000 UTC m=+20.812589424" Apr 24 16:39:28.355561 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.355523 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-t5dcg" podStartSLOduration=11.429313639 
podStartE2EDuration="20.35551359s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.73128935 +0000 UTC m=+3.188603205" lastFinishedPulling="2026-04-24 16:39:19.657489301 +0000 UTC m=+12.114803156" observedRunningTime="2026-04-24 16:39:28.35526921 +0000 UTC m=+20.812583081" watchObservedRunningTime="2026-04-24 16:39:28.35551359 +0000 UTC m=+20.812827462" Apr 24 16:39:28.370292 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.370241 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w297f" podStartSLOduration=3.240902348 podStartE2EDuration="20.370226112s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.728277035 +0000 UTC m=+3.185590883" lastFinishedPulling="2026-04-24 16:39:27.857600782 +0000 UTC m=+20.314914647" observedRunningTime="2026-04-24 16:39:28.369840454 +0000 UTC m=+20.827154327" watchObservedRunningTime="2026-04-24 16:39:28.370226112 +0000 UTC m=+20.827539983" Apr 24 16:39:28.394079 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.394027 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m9vmw" podStartSLOduration=3.472169003 podStartE2EDuration="20.394009491s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.735377388 +0000 UTC m=+3.192691237" lastFinishedPulling="2026-04-24 16:39:27.65721787 +0000 UTC m=+20.114531725" observedRunningTime="2026-04-24 16:39:28.393587622 +0000 UTC m=+20.850901495" watchObservedRunningTime="2026-04-24 16:39:28.394009491 +0000 UTC m=+20.851323363" Apr 24 16:39:28.554323 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.554292 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:28.978507 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:28.978482 2562 plugin_watcher.go:194] "Adding socket 
path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:39:29.102783 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.102697 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:28.978503235Z","UUID":"30af6989-9519-472c-bf9b-d072330046ed","Handler":null,"Name":"","Endpoint":""} Apr 24 16:39:29.105248 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.105228 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:39:29.105399 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.105254 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:39:29.172677 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.172648 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:29.172806 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:29.172753 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:29.314354 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.314316 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xlhds" event={"ID":"863af5bc-e499-4451-82ca-2c9265a5df62","Type":"ContainerStarted","Data":"0d3fa2ff02f380e786afbc260d79f49749fd68ece0379f8a5baf22c4d7af6742"} Apr 24 16:39:29.315721 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.315694 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f1e88cc-ea55-4772-9908-69b5f02b2d4d" containerID="554997922eab52e8dfc5fcb22c80552e6427cd5c828f87ca27798ce7ef2563c5" exitCode=0 Apr 24 16:39:29.315837 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.315768 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerDied","Data":"554997922eab52e8dfc5fcb22c80552e6427cd5c828f87ca27798ce7ef2563c5"} Apr 24 16:39:29.319390 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.319362 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"a31d7c240bf7e43d6955dcbb7b7fc16dba24de6ef212bd7efd57fffb681529b7"} Apr 24 16:39:29.319517 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.319408 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"b233571fcafa0c3907ca0b89a7d17b6104d31a3c05613ca6ee53f2a8fd0757d9"} Apr 24 16:39:29.319517 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.319422 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" 
event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"be3b5d13ec2ef28effbfaf48e3f4bf3371fea470aeefc37736dcb83ac53cd1d6"} Apr 24 16:39:29.319517 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.319435 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"aef20d697adbed44b20e7aab55cdf81e1f67cb003132755d17885cdc0280811e"} Apr 24 16:39:29.319517 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.319453 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"efaa4acdea44bc7749609d643e040225836440d2d6ebb0be45badb7fc08fe937"} Apr 24 16:39:29.319517 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.319473 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"9f19440a2738a007b64cda0af243f5f8f831945ec89e18363ffd057083af6bf1"} Apr 24 16:39:29.322234 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.322206 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" event={"ID":"301a467d-8583-49ee-aea7-425e83b7c4bf","Type":"ContainerStarted","Data":"6ad012d758b63ea45d78de2c352c7ca03b9991ecb37411db73769b9026754202"} Apr 24 16:39:29.329855 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.329813 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xlhds" podStartSLOduration=4.407643732 podStartE2EDuration="21.329801026s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.73506677 +0000 UTC m=+3.192380619" lastFinishedPulling="2026-04-24 16:39:27.657224057 +0000 UTC 
m=+20.114537913" observedRunningTime="2026-04-24 16:39:29.329591431 +0000 UTC m=+21.786905303" watchObservedRunningTime="2026-04-24 16:39:29.329801026 +0000 UTC m=+21.787114899" Apr 24 16:39:29.330117 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:29.330087 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xkqhq" podStartSLOduration=4.20439924 podStartE2EDuration="21.33008149s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.73417605 +0000 UTC m=+3.191489903" lastFinishedPulling="2026-04-24 16:39:27.859858304 +0000 UTC m=+20.317172153" observedRunningTime="2026-04-24 16:39:28.418590003 +0000 UTC m=+20.875903885" watchObservedRunningTime="2026-04-24 16:39:29.33008149 +0000 UTC m=+21.787395360" Apr 24 16:39:30.173805 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:30.173488 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:30.174034 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:30.173861 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:30.325852 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:30.325814 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" event={"ID":"301a467d-8583-49ee-aea7-425e83b7c4bf","Type":"ContainerStarted","Data":"7d11d4fdd5bb85f5025a17ea0d5d4ce92d4d44393b472f1cdafa3e44c3a711bc"} Apr 24 16:39:30.342832 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:30.342782 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vdgj7" podStartSLOduration=3.361676531 podStartE2EDuration="22.342768042s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.724722204 +0000 UTC m=+3.182036052" lastFinishedPulling="2026-04-24 16:39:29.70581371 +0000 UTC m=+22.163127563" observedRunningTime="2026-04-24 16:39:30.342601902 +0000 UTC m=+22.799915799" watchObservedRunningTime="2026-04-24 16:39:30.342768042 +0000 UTC m=+22.800081913" Apr 24 16:39:31.173381 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:31.173348 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:31.173567 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:31.173468 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:31.331653 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:31.331611 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"fc3d844e7010f0560442e4d93c60cfaf4e7c11e3dd3a143e4de2b8e76026060d"} Apr 24 16:39:32.172797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:32.172757 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:32.172988 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:32.172886 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:32.432331 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:32.432258 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:32.432876 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:32.432852 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:33.173676 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.173595 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:33.173843 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:33.173727 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:33.340368 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.339996 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" event={"ID":"c22d7668-64ef-42e4-ac7b-d8eb07a69e1f","Type":"ContainerStarted","Data":"5c5094c9e0217e636a1c4f9f087d71e3591afaafe371b81b7773eb42b040a558"} Apr 24 16:39:33.340368 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.340278 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:33.340368 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.340316 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:33.340562 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.340424 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:33.341570 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.341378 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-t5dcg" Apr 24 16:39:33.362763 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.360417 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:33.362763 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:39:33.360734 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:39:33.371208 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:33.371058 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" podStartSLOduration=7.848168003 podStartE2EDuration="25.371044084s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.732753692 +0000 UTC m=+3.190067556" lastFinishedPulling="2026-04-24 16:39:28.255629788 +0000 UTC m=+20.712943637" observedRunningTime="2026-04-24 16:39:33.370852388 +0000 UTC m=+25.828166259" watchObservedRunningTime="2026-04-24 16:39:33.371044084 +0000 UTC m=+25.828357955" Apr 24 16:39:34.173212 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:34.173182 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:34.173817 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:34.173323 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:34.344481 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:34.344445 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f1e88cc-ea55-4772-9908-69b5f02b2d4d" containerID="8362574343d4f44fd27de7facb7ebf3f779fe5e3f15b8380bce40f6b1ea4fdd1" exitCode=0 Apr 24 16:39:34.344632 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:34.344478 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerDied","Data":"8362574343d4f44fd27de7facb7ebf3f779fe5e3f15b8380bce40f6b1ea4fdd1"} Apr 24 16:39:35.173816 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:35.173651 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:35.174177 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:35.173885 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:35.231602 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:35.231573 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j4684"] Apr 24 16:39:35.233353 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:35.233327 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wp5zn"] Apr 24 16:39:35.233596 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:35.233455 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:35.233596 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:35.233568 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:35.348516 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:35.348431 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f1e88cc-ea55-4772-9908-69b5f02b2d4d" containerID="fa279c377a3c8267db6635e4d474d04da50038ff58348a5277285e93723901fe" exitCode=0 Apr 24 16:39:35.348516 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:35.348511 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:35.348702 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:35.348515 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerDied","Data":"fa279c377a3c8267db6635e4d474d04da50038ff58348a5277285e93723901fe"} Apr 24 16:39:35.348836 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:35.348809 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:36.354882 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:36.354851 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f1e88cc-ea55-4772-9908-69b5f02b2d4d" containerID="cd51bfcecaf8c48d98ba8eb2e753dcd217d392f95d7e21234ef33cfca38ae504" exitCode=0 Apr 24 16:39:36.355258 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:36.354908 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerDied","Data":"cd51bfcecaf8c48d98ba8eb2e753dcd217d392f95d7e21234ef33cfca38ae504"} Apr 24 16:39:37.173776 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:37.173687 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:37.173928 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:37.173696 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:37.173928 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:37.173820 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:37.173928 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:37.173893 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:39.172861 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:39.172781 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:39:39.172861 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:39.172821 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:39:39.173594 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:39.172906 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j4684" podUID="6235daff-03fe-4662-a803-e1884c643b19" Apr 24 16:39:39.173594 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:39.173039 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:39:40.841338 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.841298 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-144.ec2.internal" event="NodeReady" Apr 24 16:39:40.841855 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.841460 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 16:39:40.921255 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.921161 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56d5dcfb77-vmzmz"] Apr 24 16:39:40.946741 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.946700 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:39:40.951180 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.951155 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56d5dcfb77-vmzmz"] Apr 24 16:39:40.955572 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.955119 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 16:39:40.955572 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.955163 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-l6l2l\"" Apr 24 16:39:40.955792 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.955772 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 16:39:40.956650 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.956585 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 16:39:40.961137 
ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.961114 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 16:39:40.982116 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.982087 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dj6nz"] Apr 24 16:39:40.999580 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.999551 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zgxq7"] Apr 24 16:39:40.999729 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:40.999596 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dj6nz" Apr 24 16:39:41.011401 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.011265 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 16:39:41.011401 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.011291 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 16:39:41.011596 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.011553 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6n2jd\"" Apr 24 16:39:41.017996 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.017973 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dj6nz"] Apr 24 16:39:41.018127 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.018104 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgxq7" Apr 24 16:39:41.023743 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.023708 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-szcnf\"" Apr 24 16:39:41.023837 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.023792 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 16:39:41.023837 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.023814 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 16:39:41.024039 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.024021 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 16:39:41.030508 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.030484 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgxq7"] Apr 24 16:39:41.081911 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.081867 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz" Apr 24 16:39:41.081911 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.081912 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-registry-certificates\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" 
Apr 24 16:39:41.082166 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.081951 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-installation-pull-secrets\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:39:41.082166 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.081976 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6pcs\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-kube-api-access-n6pcs\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:39:41.082166 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082036 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-image-registry-private-configuration\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:39:41.082166 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082074 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-tmp-dir\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz" Apr 24 16:39:41.082166 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082100 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.082510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082171 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-config-volume\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.082510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082215 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-bound-sa-token\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.082510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082277 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:41.082510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082303 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62sd\" (UniqueName: \"kubernetes.io/projected/444f46c8-3c9a-4e72-8000-ca142ae511ef-kube-api-access-g62sd\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:41.082510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082329 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfpq\" (UniqueName: \"kubernetes.io/projected/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-kube-api-access-9nfpq\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.082510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082387 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-trusted-ca\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.082510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.082421 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8f243a6-3111-4778-9ca9-092db5973836-ca-trust-extracted\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.173511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.173429 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:39:41.173685 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.173420 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:41.176640 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.176614 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 16:39:41.176640 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.176637 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 16:39:41.176844 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.176684 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-v7jl2\""
Apr 24 16:39:41.176844 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.176710 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 16:39:41.177062 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.177043 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xv2x8\""
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183200 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-trusted-ca\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183256 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8f243a6-3111-4778-9ca9-092db5973836-ca-trust-extracted\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183285 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183318 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-registry-certificates\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183352 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-installation-pull-secrets\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183386 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6pcs\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-kube-api-access-n6pcs\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183423 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-image-registry-private-configuration\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183452 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-tmp-dir\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183483 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183516 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-config-volume\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183547 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-bound-sa-token\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183600 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183635 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g62sd\" (UniqueName: \"kubernetes.io/projected/444f46c8-3c9a-4e72-8000-ca142ae511ef-kube-api-access-g62sd\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.183665 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfpq\" (UniqueName: \"kubernetes.io/projected/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-kube-api-access-9nfpq\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.185000 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-trusted-ca\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.185299 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.185084 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.185196 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.685166331 +0000 UTC m=+34.142480199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.185367 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-tmp-dir\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.185654 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.185662 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.185680 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.185736 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.685711474 +0000 UTC m=+34.143025334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.185764 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.685749774 +0000 UTC m=+34.143063626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.185769 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8f243a6-3111-4778-9ca9-092db5973836-ca-trust-extracted\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.186010 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-config-volume\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.186156 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.186147 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-registry-certificates\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.190003 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.189980 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-image-registry-private-configuration\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.190150 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.190073 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-installation-pull-secrets\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.192815 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.192756 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nfpq\" (UniqueName: \"kubernetes.io/projected/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-kube-api-access-9nfpq\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.195264 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.195220 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6pcs\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-kube-api-access-n6pcs\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.197021 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.196997 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-bound-sa-token\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.197338 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.197319 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62sd\" (UniqueName: \"kubernetes.io/projected/444f46c8-3c9a-4e72-8000-ca142ae511ef-kube-api-access-g62sd\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:41.688224 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.688189 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:41.688224 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.688234 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.688342 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.688355 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.688370 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.688398 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.688418 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.688398763 +0000 UTC m=+35.145712613 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.688459 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.688441368 +0000 UTC m=+35.145755230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.688467 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:41.688569 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.688501 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.688490242 +0000 UTC m=+35.145804091 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found
Apr 24 16:39:41.789638 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.789604 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:39:41.789825 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.789670 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:41.789825 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.789733 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 16:39:41.789825 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:41.789803 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:13.789784344 +0000 UTC m=+66.247098206 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : secret "metrics-daemon-secret" not found
Apr 24 16:39:41.797945 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.797903 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84m58\" (UniqueName: \"kubernetes.io/projected/6235daff-03fe-4662-a803-e1884c643b19-kube-api-access-84m58\") pod \"network-check-target-j4684\" (UID: \"6235daff-03fe-4662-a803-e1884c643b19\") " pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:41.807860 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:41.807831 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:42.050707 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:42.050544 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j4684"]
Apr 24 16:39:42.054200 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:42.054168 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6235daff_03fe_4662_a803_e1884c643b19.slice/crio-c409fbc3d795aca451e59e4c61776abde48f99f2b0b219f0b1a467eceb77102b WatchSource:0}: Error finding container c409fbc3d795aca451e59e4c61776abde48f99f2b0b219f0b1a467eceb77102b: Status 404 returned error can't find the container with id c409fbc3d795aca451e59e4c61776abde48f99f2b0b219f0b1a467eceb77102b
Apr 24 16:39:42.367915 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:42.367870 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j4684" event={"ID":"6235daff-03fe-4662-a803-e1884c643b19","Type":"ContainerStarted","Data":"c409fbc3d795aca451e59e4c61776abde48f99f2b0b219f0b1a467eceb77102b"}
Apr 24 16:39:42.696482 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:42.696438 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:42.696714 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:42.696520 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:42.696714 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:42.696580 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:42.696714 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:42.696598 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:42.696714 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:42.696624 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found
Apr 24 16:39:42.696714 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:42.696679 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:42.696714 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:42.696695 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:44.696671975 +0000 UTC m=+37.153985831 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found
Apr 24 16:39:42.697059 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:42.696733 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:44.696717031 +0000 UTC m=+37.154030886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:42.697059 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:42.696680 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:42.697059 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:42.696769 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:39:44.696760339 +0000 UTC m=+37.154074193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found
Apr 24 16:39:43.372814 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:43.372781 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f1e88cc-ea55-4772-9908-69b5f02b2d4d" containerID="95269df070fb8f3580ff5ee5113fe322955741e131d71a2bead9dacab1e8b29c" exitCode=0
Apr 24 16:39:43.373315 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:43.372841 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerDied","Data":"95269df070fb8f3580ff5ee5113fe322955741e131d71a2bead9dacab1e8b29c"}
Apr 24 16:39:44.378134 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:44.378100 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f1e88cc-ea55-4772-9908-69b5f02b2d4d" containerID="f4e3ded9682a81ed94d9aff315502699293537fae7234ff370b0c4b2b1be8d7a" exitCode=0
Apr 24 16:39:44.378580 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:44.378186 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerDied","Data":"f4e3ded9682a81ed94d9aff315502699293537fae7234ff370b0c4b2b1be8d7a"}
Apr 24 16:39:44.712740 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:44.712644 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:44.712740 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:44.712701 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:44.712740 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:44.712736 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:44.713055 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:44.712818 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:44.713055 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:44.712853 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:44.713055 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:44.712873 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found
Apr 24 16:39:44.713055 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:44.712904 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:48.712882738 +0000 UTC m=+41.170196601 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:44.713055 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:44.712825 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:44.713055 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:44.712924 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:48.712914525 +0000 UTC m=+41.170228375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found
Apr 24 16:39:44.713055 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:44.712958 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:39:48.712949776 +0000 UTC m=+41.170263628 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found
Apr 24 16:39:45.383726 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:45.383478 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" event={"ID":"8f1e88cc-ea55-4772-9908-69b5f02b2d4d","Type":"ContainerStarted","Data":"c815370e089d46eda15b3ed2dc26b6f6cfd7c98e8522fb91a100bb7d3bfc354b"}
Apr 24 16:39:45.384752 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:45.384722 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j4684" event={"ID":"6235daff-03fe-4662-a803-e1884c643b19","Type":"ContainerStarted","Data":"f7ba3332b798aa042d1ad7c7d51cbe44d64cc8653d3b9fb08f226339de815ddb"}
Apr 24 16:39:45.384878 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:45.384839 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-j4684"
Apr 24 16:39:45.405513 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:45.405411 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xjmsw" podStartSLOduration=5.910668966 podStartE2EDuration="37.40539446s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.73282027 +0000 UTC m=+3.190134137" lastFinishedPulling="2026-04-24 16:39:42.227545782 +0000 UTC m=+34.684859631" observedRunningTime="2026-04-24 16:39:45.403916398 +0000 UTC m=+37.861230280" watchObservedRunningTime="2026-04-24 16:39:45.40539446 +0000 UTC m=+37.862708332"
Apr 24 16:39:45.418462 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:45.418417 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-j4684" podStartSLOduration=34.235056546 podStartE2EDuration="37.418405456s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:39:42.05604021 +0000 UTC m=+34.513354059" lastFinishedPulling="2026-04-24 16:39:45.23938911 +0000 UTC m=+37.696702969" observedRunningTime="2026-04-24 16:39:45.417669899 +0000 UTC m=+37.874983769" watchObservedRunningTime="2026-04-24 16:39:45.418405456 +0000 UTC m=+37.875719326"
Apr 24 16:39:48.743537 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:48.743495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:48.743560 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:48.743587 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:48.743638 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:48.743680 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:48.743690 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:48.743712 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.743691922 +0000 UTC m=+49.201005776 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found
Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:48.743735 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.743720908 +0000 UTC m=+49.201034760 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:48.743693 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found Apr 24 16:39:48.744028 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:48.743769 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.743761906 +0000 UTC m=+49.201075757 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found Apr 24 16:39:56.801850 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:56.801814 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:56.801863 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7" 
Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:56.801896 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz" Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:56.801986 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:56.802004 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:56.802008 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:56.802027 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:56.802058 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:12.802044337 +0000 UTC m=+65.259358186 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:56.802070 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:12.802064899 +0000 UTC m=+65.259378747 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found Apr 24 16:39:56.802247 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:39:56.802080 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:40:12.802074956 +0000 UTC m=+65.259388805 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found Apr 24 16:39:57.739391 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.739354 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6"] Apr 24 16:39:57.750952 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.750917 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:57.753496 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.753473 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 16:39:57.753496 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.753484 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 16:39:57.754191 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.754171 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6"] Apr 24 16:39:57.754518 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.754496 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-crlgq\"" Apr 24 16:39:57.754643 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.754502 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 16:39:57.754643 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.754500 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 16:39:57.911392 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.911349 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e812eb2-480c-4a84-b382-d7c26bd0da17-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-667fffb569-r7fw6\" (UID: \"5e812eb2-480c-4a84-b382-d7c26bd0da17\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:57.911742 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:57.911405 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcmmc\" (UniqueName: \"kubernetes.io/projected/5e812eb2-480c-4a84-b382-d7c26bd0da17-kube-api-access-fcmmc\") pod \"managed-serviceaccount-addon-agent-667fffb569-r7fw6\" (UID: \"5e812eb2-480c-4a84-b382-d7c26bd0da17\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:58.012686 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:58.012616 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e812eb2-480c-4a84-b382-d7c26bd0da17-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-667fffb569-r7fw6\" (UID: \"5e812eb2-480c-4a84-b382-d7c26bd0da17\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:58.012686 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:58.012665 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcmmc\" (UniqueName: \"kubernetes.io/projected/5e812eb2-480c-4a84-b382-d7c26bd0da17-kube-api-access-fcmmc\") pod \"managed-serviceaccount-addon-agent-667fffb569-r7fw6\" (UID: \"5e812eb2-480c-4a84-b382-d7c26bd0da17\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:58.021328 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:58.021297 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcmmc\" (UniqueName: \"kubernetes.io/projected/5e812eb2-480c-4a84-b382-d7c26bd0da17-kube-api-access-fcmmc\") pod \"managed-serviceaccount-addon-agent-667fffb569-r7fw6\" (UID: \"5e812eb2-480c-4a84-b382-d7c26bd0da17\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:58.027816 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:58.027794 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e812eb2-480c-4a84-b382-d7c26bd0da17-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-667fffb569-r7fw6\" (UID: \"5e812eb2-480c-4a84-b382-d7c26bd0da17\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:58.074046 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:58.074017 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" Apr 24 16:39:58.192498 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:58.192470 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6"] Apr 24 16:39:58.195507 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:39:58.195474 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e812eb2_480c_4a84_b382_d7c26bd0da17.slice/crio-575a17bac55791b673b42e66f3bbe1e5c566e35c45602b278fe4e7b78d4e545e WatchSource:0}: Error finding container 575a17bac55791b673b42e66f3bbe1e5c566e35c45602b278fe4e7b78d4e545e: Status 404 returned error can't find the container with id 575a17bac55791b673b42e66f3bbe1e5c566e35c45602b278fe4e7b78d4e545e Apr 24 16:39:58.414178 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:39:58.414089 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" event={"ID":"5e812eb2-480c-4a84-b382-d7c26bd0da17","Type":"ContainerStarted","Data":"575a17bac55791b673b42e66f3bbe1e5c566e35c45602b278fe4e7b78d4e545e"} Apr 24 
16:40:00.418634 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:00.418603 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" event={"ID":"5e812eb2-480c-4a84-b382-d7c26bd0da17","Type":"ContainerStarted","Data":"75344998c22843bc3d0cd65de3c82e582673116e7c5fd3b97a1d05e1e4fdf0ea"} Apr 24 16:40:00.436462 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:00.436377 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" podStartSLOduration=1.2850898960000001 podStartE2EDuration="3.436360584s" podCreationTimestamp="2026-04-24 16:39:57 +0000 UTC" firstStartedPulling="2026-04-24 16:39:58.197318913 +0000 UTC m=+50.654632762" lastFinishedPulling="2026-04-24 16:40:00.348589587 +0000 UTC m=+52.805903450" observedRunningTime="2026-04-24 16:40:00.435768457 +0000 UTC m=+52.893082332" watchObservedRunningTime="2026-04-24 16:40:00.436360584 +0000 UTC m=+52.893674455" Apr 24 16:40:05.360415 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:05.360384 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtxl5" Apr 24 16:40:12.816971 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:12.816915 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz" Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:12.816990 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: 
\"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:12.817034 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7" Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:12.817080 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:12.817121 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:12.817139 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:12.817148 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:12.817172 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:44.817151711 +0000 UTC m=+97.274465574 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:12.817196 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:40:44.817180923 +0000 UTC m=+97.274494773 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found Apr 24 16:40:12.817431 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:12.817213 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:44.817203885 +0000 UTC m=+97.274517741 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found Apr 24 16:40:13.824858 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:13.824814 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:40:13.825284 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:13.824962 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:40:13.825284 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:13.825019 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:17.825004668 +0000 UTC m=+130.282318517 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : secret "metrics-daemon-secret" not found Apr 24 16:40:16.388885 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:16.388850 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-j4684" Apr 24 16:40:44.843231 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:44.843188 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7" Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:44.843253 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz" Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:40:44.843279 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:44.843350 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:44.843387 2562 secret.go:189] Couldn't 
get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:44.843364 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:44.843428 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:44.843431 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:41:48.843410393 +0000 UTC m=+161.300724246 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:44.843449 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:48.843440546 +0000 UTC m=+161.300754401 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found Apr 24 16:40:44.843592 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:40:44.843463 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:48.843455897 +0000 UTC m=+161.300769748 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found Apr 24 16:41:17.876220 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:17.876162 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:41:17.876714 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:17.876313 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:41:17.876714 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:17.876390 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs podName:93f47728-cffc-4da9-9791-92c3d70ac2d2 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:19.876373996 +0000 UTC m=+252.333687850 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs") pod "network-metrics-daemon-wp5zn" (UID: "93f47728-cffc-4da9-9791-92c3d70ac2d2") : secret "metrics-daemon-secret" not found Apr 24 16:41:36.340014 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.339967 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m"] Apr 24 16:41:36.342029 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.342012 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-99ws4"] Apr 24 16:41:36.342176 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.342159 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:36.346028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.345188 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 16:41:36.346028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.345286 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.346028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.345371 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.346028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.345457 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.346028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.345916 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6br5m\"" Apr 24 16:41:36.351167 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.351148 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 16:41:36.351473 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.351451 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-x7p7g\"" Apr 24 16:41:36.351574 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.351451 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.351778 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.351756 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 16:41:36.352602 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.352447 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.356897 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.356873 2562 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-99ws4"] Apr 24 16:41:36.357982 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.357953 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m"] Apr 24 16:41:36.361288 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.361266 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 16:41:36.482098 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.482065 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch"] Apr 24 16:41:36.483970 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.483927 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch" Apr 24 16:41:36.485463 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.485441 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-d8c574-57dx2"] Apr 24 16:41:36.487832 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.487816 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.509576 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.509555 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqnbv\" (UniqueName: \"kubernetes.io/projected/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-kube-api-access-nqnbv\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:36.509692 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.509600 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7235e10c-761b-4d2d-a4f9-2d8114898c5d-serving-cert\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.509692 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.509618 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7235e10c-761b-4d2d-a4f9-2d8114898c5d-trusted-ca\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.509805 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.509731 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cth9n\" (UniqueName: \"kubernetes.io/projected/7235e10c-761b-4d2d-a4f9-2d8114898c5d-kube-api-access-cth9n\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.509859 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:41:36.509806 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7235e10c-761b-4d2d-a4f9-2d8114898c5d-config\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.509859 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.509840 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:36.525691 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.525662 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.525844 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.525829 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.530751 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.530725 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-143-144.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'ip-10-0-143-144.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 24 16:41:36.530853 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.530816 2562 reflector.go:200] "Failed to watch" err="failed to list 
*v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-143-144.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'ip-10-0-143-144.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 24 16:41:36.530853 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.530821 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"router-stats-default\" is forbidden: User \"system:node:ip-10-0-143-144.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'ip-10-0-143-144.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" type="*v1.Secret" Apr 24 16:41:36.548452 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.548421 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"default-ingress-cert\" is forbidden: User \"system:node:ip-10-0-143-144.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'ip-10-0-143-144.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" type="*v1.Secret" Apr 24 16:41:36.571999 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.571973 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-sfpll\"" Apr 24 16:41:36.584808 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.584780 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"router-dockercfg-79wnh\" is forbidden: User \"system:node:ip-10-0-143-144.ec2.internal\" cannot list 
resource \"secrets\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'ip-10-0-143-144.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-ingress\"/\"router-dockercfg-79wnh\"" type="*v1.Secret" Apr 24 16:41:36.584808 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.584780 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:ip-10-0-143-144.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'ip-10-0-143-144.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" type="*v1.ConfigMap" Apr 24 16:41:36.584969 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.584782 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"router-metrics-certs-default\" is forbidden: User \"system:node:ip-10-0-143-144.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ingress\": no relationship found between node 'ip-10-0-143-144.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" type="*v1.Secret" Apr 24 16:41:36.603060 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.602990 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj"] Apr 24 16:41:36.604845 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.604827 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch"] Apr 24 16:41:36.604955 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.604851 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"] Apr 24 16:41:36.604997 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.604985 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj" Apr 24 16:41:36.606813 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.606795 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d8c574-57dx2"] Apr 24 16:41:36.606919 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.606889 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:41:36.607592 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.607571 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-dhpbl\"" Apr 24 16:41:36.609828 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.609810 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.610156 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610139 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7235e10c-761b-4d2d-a4f9-2d8114898c5d-config\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.610223 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610170 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-stats-auth\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " 
pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.610223 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610189 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:36.610223 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610206 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-default-certificate\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.610378 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.610296 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 16:41:36.610378 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610293 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnbv\" (UniqueName: \"kubernetes.io/projected/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-kube-api-access-nqnbv\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:36.610378 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.610348 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls podName:8fb63bb2-c5b7-4326-9e09-94fdeae1f646 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:37.110331065 +0000 UTC m=+149.567644918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vv76m" (UID: "8fb63bb2-c5b7-4326-9e09-94fdeae1f646") : secret "samples-operator-tls" not found Apr 24 16:41:36.610527 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610383 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.610527 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610453 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7235e10c-761b-4d2d-a4f9-2d8114898c5d-serving-cert\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.610527 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610478 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7235e10c-761b-4d2d-a4f9-2d8114898c5d-trusted-ca\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.610527 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610519 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod 
\"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.610720 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610547 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bslpc\" (UniqueName: \"kubernetes.io/projected/04503171-24fb-471c-9dcc-be6c1d1b3331-kube-api-access-bslpc\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.610720 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610598 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cth9n\" (UniqueName: \"kubernetes.io/projected/7235e10c-761b-4d2d-a4f9-2d8114898c5d-kube-api-access-cth9n\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.610720 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610642 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9n7\" (UniqueName: \"kubernetes.io/projected/3371be78-94e6-4a3b-98b2-9aaad783afa5-kube-api-access-sp9n7\") pod \"volume-data-source-validator-7c6cbb6c87-stqch\" (UID: \"3371be78-94e6-4a3b-98b2-9aaad783afa5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch" Apr 24 16:41:36.610866 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.610851 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7235e10c-761b-4d2d-a4f9-2d8114898c5d-config\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.611376 ip-10-0-143-144 kubenswrapper[2562]: 
I0424 16:41:36.611154 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.611376 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.611158 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 16:41:36.611376 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.611294 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-rd7fz\"" Apr 24 16:41:36.611376 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.611368 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 16:41:36.611623 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.611395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7235e10c-761b-4d2d-a4f9-2d8114898c5d-trusted-ca\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.613372 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.613350 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7235e10c-761b-4d2d-a4f9-2d8114898c5d-serving-cert\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.628294 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.628271 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"] Apr 24 16:41:36.629475 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.629453 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj"] Apr 24 16:41:36.633027 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.633002 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cth9n\" (UniqueName: \"kubernetes.io/projected/7235e10c-761b-4d2d-a4f9-2d8114898c5d-kube-api-access-cth9n\") pod \"console-operator-9d4b6777b-99ws4\" (UID: \"7235e10c-761b-4d2d-a4f9-2d8114898c5d\") " pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.633239 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.633219 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnbv\" (UniqueName: \"kubernetes.io/projected/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-kube-api-access-nqnbv\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:36.634673 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.634654 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"] Apr 24 16:41:36.637151 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.637134 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" Apr 24 16:41:36.639542 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.639520 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 16:41:36.640117 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.640093 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.640224 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.640130 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.640224 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.640173 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 16:41:36.643504 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.641540 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2dtr6\"" Apr 24 16:41:36.645601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.645576 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"] Apr 24 16:41:36.647667 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.647651 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jltdt"] Apr 24 16:41:36.647805 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.647791 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf" Apr 24 16:41:36.649888 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.649868 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jltdt" Apr 24 16:41:36.650233 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.650215 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.650338 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.650318 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 16:41:36.650415 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.650349 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.650415 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.650372 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 16:41:36.650518 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.650317 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-7hth9\"" Apr 24 16:41:36.650571 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.650531 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"] Apr 24 16:41:36.652195 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.652174 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 16:41:36.652324 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.652282 2562 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-kbj5t\"" Apr 24 16:41:36.652886 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.652868 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:41:36.653004 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.652925 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:41:36.653399 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.653383 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 16:41:36.659784 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.659762 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 16:41:36.661782 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.661765 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:36.666471 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.666448 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jltdt"] Apr 24 16:41:36.667326 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.667306 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"] Apr 24 16:41:36.712137 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712069 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.712137 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712121 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" Apr 24 16:41:36.712328 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712164 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9n7\" (UniqueName: \"kubernetes.io/projected/3371be78-94e6-4a3b-98b2-9aaad783afa5-kube-api-access-sp9n7\") pod \"volume-data-source-validator-7c6cbb6c87-stqch\" (UID: \"3371be78-94e6-4a3b-98b2-9aaad783afa5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch" Apr 24 16:41:36.712328 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712197 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-default-certificate\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.712328 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712250 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" Apr 24 16:41:36.712328 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712286 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.712328 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712315 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzfch\" (UniqueName: \"kubernetes.io/projected/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-kube-api-access-lzfch\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" Apr 24 16:41:36.712571 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712371 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bslpc\" (UniqueName: 
\"kubernetes.io/projected/04503171-24fb-471c-9dcc-be6c1d1b3331-kube-api-access-bslpc\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.712571 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712400 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh96h\" (UniqueName: \"kubernetes.io/projected/b822941a-294c-4710-89a8-5ee93dbb2e7c-kube-api-access-bh96h\") pod \"network-check-source-8894fc9bd-z62vj\" (UID: \"b822941a-294c-4710-89a8-5ee93dbb2e7c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj" Apr 24 16:41:36.712571 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712476 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-stats-auth\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:36.712571 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cc0177b3-ada2-4478-8a4f-354de52d1414-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:41:36.712571 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712561 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: 
\"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:36.712811 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.712589 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzqvk\" (UniqueName: \"kubernetes.io/projected/cc0177b3-ada2-4478-8a4f-354de52d1414-kube-api-access-mzqvk\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:36.722593 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.722563 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9n7\" (UniqueName: \"kubernetes.io/projected/3371be78-94e6-4a3b-98b2-9aaad783afa5-kube-api-access-sp9n7\") pod \"volume-data-source-validator-7c6cbb6c87-stqch\" (UID: \"3371be78-94e6-4a3b-98b2-9aaad783afa5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch"
Apr 24 16:41:36.782182 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.782152 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-99ws4"]
Apr 24 16:41:36.784845 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:36.784818 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7235e10c_761b_4d2d_a4f9_2d8114898c5d.slice/crio-03d62ad1b3dca54d0ca1bb4c8eda7c426d9c19a8bbc16ce806deb08fea70bc36 WatchSource:0}: Error finding container 03d62ad1b3dca54d0ca1bb4c8eda7c426d9c19a8bbc16ce806deb08fea70bc36: Status 404 returned error can't find the container with id 03d62ad1b3dca54d0ca1bb4c8eda7c426d9c19a8bbc16ce806deb08fea70bc36
Apr 24 16:41:36.793492 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.793459 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch"
Apr 24 16:41:36.813585 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813557 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0708a89-9e11-49df-97d8-8cbf72ad25dd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.813710 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813610 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"
Apr 24 16:41:36.813710 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813676 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzfch\" (UniqueName: \"kubernetes.io/projected/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-kube-api-access-lzfch\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"
Apr 24 16:41:36.813828 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813716 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckts5\" (UniqueName: \"kubernetes.io/projected/d0708a89-9e11-49df-97d8-8cbf72ad25dd-kube-api-access-ckts5\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.813828 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813763 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh96h\" (UniqueName: \"kubernetes.io/projected/b822941a-294c-4710-89a8-5ee93dbb2e7c-kube-api-access-bh96h\") pod \"network-check-source-8894fc9bd-z62vj\" (UID: \"b822941a-294c-4710-89a8-5ee93dbb2e7c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj"
Apr 24 16:41:36.813828 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813802 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc664fe7-8e82-414a-88b1-9faec08dd651-service-ca-bundle\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.814002 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813851 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cc0177b3-ada2-4478-8a4f-354de52d1414-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:36.814002 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:36.814002 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813910 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzqvk\" (UniqueName: \"kubernetes.io/projected/cc0177b3-ada2-4478-8a4f-354de52d1414-kube-api-access-mzqvk\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:36.814002 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.813953 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc664fe7-8e82-414a-88b1-9faec08dd651-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.814184 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.813998 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 16:41:36.814184 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:36.814061 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls podName:cc0177b3-ada2-4478-8a4f-354de52d1414 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:37.314040499 +0000 UTC m=+149.771354352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n5n4w" (UID: "cc0177b3-ada2-4478-8a4f-354de52d1414") : secret "cluster-monitoring-operator-tls" not found
Apr 24 16:41:36.814184 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814128 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bc664fe7-8e82-414a-88b1-9faec08dd651-snapshots\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.814184 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814156 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc664fe7-8e82-414a-88b1-9faec08dd651-serving-cert\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.814380 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814243 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0708a89-9e11-49df-97d8-8cbf72ad25dd-config\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.814380 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814271 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclzg\" (UniqueName: \"kubernetes.io/projected/bc664fe7-8e82-414a-88b1-9faec08dd651-kube-api-access-dclzg\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.814380 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814331 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc664fe7-8e82-414a-88b1-9faec08dd651-tmp\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.814380 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814354 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"
Apr 24 16:41:36.814692 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814669 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cc0177b3-ada2-4478-8a4f-354de52d1414-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:36.814761 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.814744 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"
Apr 24 16:41:36.815885 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.815865 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"
Apr 24 16:41:36.824892 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.824870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzfch\" (UniqueName: \"kubernetes.io/projected/e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9-kube-api-access-lzfch\") pod \"kube-storage-version-migrator-operator-6769c5d45-6kdn6\" (UID: \"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"
Apr 24 16:41:36.826851 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.826830 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzqvk\" (UniqueName: \"kubernetes.io/projected/cc0177b3-ada2-4478-8a4f-354de52d1414-kube-api-access-mzqvk\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:36.827221 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.827205 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh96h\" (UniqueName: \"kubernetes.io/projected/b822941a-294c-4710-89a8-5ee93dbb2e7c-kube-api-access-bh96h\") pod \"network-check-source-8894fc9bd-z62vj\" (UID: \"b822941a-294c-4710-89a8-5ee93dbb2e7c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj"
Apr 24 16:41:36.915739 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.915695 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc664fe7-8e82-414a-88b1-9faec08dd651-service-ca-bundle\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.915891 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.915758 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc664fe7-8e82-414a-88b1-9faec08dd651-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.915891 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.915784 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bc664fe7-8e82-414a-88b1-9faec08dd651-snapshots\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.915891 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.915800 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc664fe7-8e82-414a-88b1-9faec08dd651-serving-cert\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.916039 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.915962 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0708a89-9e11-49df-97d8-8cbf72ad25dd-config\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.916039 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.915995 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dclzg\" (UniqueName: \"kubernetes.io/projected/bc664fe7-8e82-414a-88b1-9faec08dd651-kube-api-access-dclzg\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.916039 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916025 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc664fe7-8e82-414a-88b1-9faec08dd651-tmp\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.916666 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916240 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0708a89-9e11-49df-97d8-8cbf72ad25dd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.916666 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916326 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckts5\" (UniqueName: \"kubernetes.io/projected/d0708a89-9e11-49df-97d8-8cbf72ad25dd-kube-api-access-ckts5\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.916666 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916373 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc664fe7-8e82-414a-88b1-9faec08dd651-service-ca-bundle\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.916666 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc664fe7-8e82-414a-88b1-9faec08dd651-tmp\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.916666 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916497 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bc664fe7-8e82-414a-88b1-9faec08dd651-snapshots\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.916666 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0708a89-9e11-49df-97d8-8cbf72ad25dd-config\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.916952 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.916856 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc664fe7-8e82-414a-88b1-9faec08dd651-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.918225 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.918201 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc664fe7-8e82-414a-88b1-9faec08dd651-serving-cert\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.918318 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.918284 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0708a89-9e11-49df-97d8-8cbf72ad25dd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.921036 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.921008 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj"
Apr 24 16:41:36.921257 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.921236 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch"]
Apr 24 16:41:36.925581 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:36.925560 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3371be78_94e6_4a3b_98b2_9aaad783afa5.slice/crio-729b820083ee2d2fb1245985053d42a4d311da5c276c10110f49639e9e4c8c33 WatchSource:0}: Error finding container 729b820083ee2d2fb1245985053d42a4d311da5c276c10110f49639e9e4c8c33: Status 404 returned error can't find the container with id 729b820083ee2d2fb1245985053d42a4d311da5c276c10110f49639e9e4c8c33
Apr 24 16:41:36.932636 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.932613 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclzg\" (UniqueName: \"kubernetes.io/projected/bc664fe7-8e82-414a-88b1-9faec08dd651-kube-api-access-dclzg\") pod \"insights-operator-585dfdc468-jltdt\" (UID: \"bc664fe7-8e82-414a-88b1-9faec08dd651\") " pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:36.933175 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.933154 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckts5\" (UniqueName: \"kubernetes.io/projected/d0708a89-9e11-49df-97d8-8cbf72ad25dd-kube-api-access-ckts5\") pod \"service-ca-operator-d6fc45fc5-nwdmf\" (UID: \"d0708a89-9e11-49df-97d8-8cbf72ad25dd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.955443 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.955404 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"
Apr 24 16:41:36.961411 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.961383 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"
Apr 24 16:41:36.967454 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:36.967237 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jltdt"
Apr 24 16:41:37.066927 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.066900 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj"]
Apr 24 16:41:37.070021 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:37.069992 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb822941a_294c_4710_89a8_5ee93dbb2e7c.slice/crio-5585345bfbf8a8034a11ec4597f73bedb26265eb37170de22cdbb902548b963b WatchSource:0}: Error finding container 5585345bfbf8a8034a11ec4597f73bedb26265eb37170de22cdbb902548b963b: Status 404 returned error can't find the container with id 5585345bfbf8a8034a11ec4597f73bedb26265eb37170de22cdbb902548b963b
Apr 24 16:41:37.108906 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.108869 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6"]
Apr 24 16:41:37.115116 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:37.115071 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0aba91b_576f_4ced_95dc_ed6bbd9e5cf9.slice/crio-e6204b948ffbb03cde682368db3b8d9f09a9e47cbab2cf4860ab592d0c50359d WatchSource:0}: Error finding container e6204b948ffbb03cde682368db3b8d9f09a9e47cbab2cf4860ab592d0c50359d: Status 404 returned error can't find the container with id e6204b948ffbb03cde682368db3b8d9f09a9e47cbab2cf4860ab592d0c50359d
Apr 24 16:41:37.117653 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.117630 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m"
Apr 24 16:41:37.117797 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.117774 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:41:37.117891 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.117832 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls podName:8fb63bb2-c5b7-4326-9e09-94fdeae1f646 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.117817033 +0000 UTC m=+150.575130886 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vv76m" (UID: "8fb63bb2-c5b7-4326-9e09-94fdeae1f646") : secret "samples-operator-tls" not found
Apr 24 16:41:37.318922 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.318887 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"
Apr 24 16:41:37.319132 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.319058 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 16:41:37.319196 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.319142 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls podName:cc0177b3-ada2-4478-8a4f-354de52d1414 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.319119152 +0000 UTC m=+150.776433006 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n5n4w" (UID: "cc0177b3-ada2-4478-8a4f-354de52d1414") : secret "cluster-monitoring-operator-tls" not found
Apr 24 16:41:37.342795 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.342761 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jltdt"]
Apr 24 16:41:37.344804 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.344759 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf"]
Apr 24 16:41:37.345255 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:37.345224 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc664fe7_8e82_414a_88b1_9faec08dd651.slice/crio-8594450312ffe74bf4d44abdb3e7c71041b416495114287f6854567ac5d5a833 WatchSource:0}: Error finding container 8594450312ffe74bf4d44abdb3e7c71041b416495114287f6854567ac5d5a833: Status 404 returned error can't find the container with id 8594450312ffe74bf4d44abdb3e7c71041b416495114287f6854567ac5d5a833
Apr 24 16:41:37.348197 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:37.348160 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0708a89_9e11_49df_97d8_8cbf72ad25dd.slice/crio-0ba7858c3fc1942940cc127fdda7c8367367a8ed47403603b52db424f7847e6d WatchSource:0}: Error finding container 0ba7858c3fc1942940cc127fdda7c8367367a8ed47403603b52db424f7847e6d: Status 404 returned error can't find the container with id 0ba7858c3fc1942940cc127fdda7c8367367a8ed47403603b52db424f7847e6d
Apr 24 16:41:37.547949 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.547703 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 16:41:37.553158 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.553130 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.053103048 +0000 UTC m=+150.510416914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:37.599817 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.599478 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj" event={"ID":"b822941a-294c-4710-89a8-5ee93dbb2e7c","Type":"ContainerStarted","Data":"515a22443d1ebc6f47d31ae6e9fb70023bab00852f7eaee5e889b2b2a6731255"}
Apr 24 16:41:37.599817 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.599522 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj" event={"ID":"b822941a-294c-4710-89a8-5ee93dbb2e7c","Type":"ContainerStarted","Data":"5585345bfbf8a8034a11ec4597f73bedb26265eb37170de22cdbb902548b963b"}
Apr 24 16:41:37.602138 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.602068 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" event={"ID":"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9","Type":"ContainerStarted","Data":"e6204b948ffbb03cde682368db3b8d9f09a9e47cbab2cf4860ab592d0c50359d"}
Apr 24 16:41:37.605282 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.605168 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch" event={"ID":"3371be78-94e6-4a3b-98b2-9aaad783afa5","Type":"ContainerStarted","Data":"729b820083ee2d2fb1245985053d42a4d311da5c276c10110f49639e9e4c8c33"}
Apr 24 16:41:37.608718 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.608661 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jltdt" event={"ID":"bc664fe7-8e82-414a-88b1-9faec08dd651","Type":"ContainerStarted","Data":"8594450312ffe74bf4d44abdb3e7c71041b416495114287f6854567ac5d5a833"}
Apr 24 16:41:37.610094 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.610009 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf" event={"ID":"d0708a89-9e11-49df-97d8-8cbf72ad25dd","Type":"ContainerStarted","Data":"0ba7858c3fc1942940cc127fdda7c8367367a8ed47403603b52db424f7847e6d"}
Apr 24 16:41:37.612685 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.612642 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" event={"ID":"7235e10c-761b-4d2d-a4f9-2d8114898c5d","Type":"ContainerStarted","Data":"03d62ad1b3dca54d0ca1bb4c8eda7c426d9c19a8bbc16ce806deb08fea70bc36"}
Apr 24 16:41:37.653329 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.653040 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 16:41:37.713243 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.713168 2562 secret.go:189] Couldn't get secret openshift-ingress/default-ingress-cert: failed to sync secret cache: timed out waiting for the condition
Apr 24 16:41:37.713424 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.713252 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-default-certificate podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.213227362 +0000 UTC m=+150.670541224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-default-certificate") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : failed to sync secret cache: timed out waiting for the condition
Apr 24 16:41:37.713424 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.713290 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition
Apr 24 16:41:37.713424 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.713324 2562 secret.go:189] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition
Apr 24 16:41:37.713424 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.713343 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.213330025 +0000 UTC m=+150.670643885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : failed to sync secret cache: timed out waiting for the condition
Apr 24 16:41:37.713652 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.713433 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-stats-auth podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.21341123 +0000 UTC m=+150.670725084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-stats-auth") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : failed to sync secret cache: timed out waiting for the condition
Apr 24 16:41:37.720285 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.719971 2562 projected.go:289] Couldn't get configMap openshift-ingress/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Apr 24 16:41:37.720285 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.720011 2562 projected.go:194] Error preparing data for projected volume kube-api-access-bslpc for pod openshift-ingress/router-default-d8c574-57dx2: failed to sync configmap cache: timed out waiting for the condition
Apr 24 16:41:37.720285 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:37.720076 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04503171-24fb-471c-9dcc-be6c1d1b3331-kube-api-access-bslpc podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.220058481 +0000 UTC m=+150.677372337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bslpc" (UniqueName: "kubernetes.io/projected/04503171-24fb-471c-9dcc-be6c1d1b3331-kube-api-access-bslpc") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : failed to sync configmap cache: timed out waiting for the condition
Apr 24 16:41:37.875261 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.875178 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 16:41:37.944073 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.944042 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-79wnh\""
Apr 24 16:41:37.961320 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:37.961286 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 16:41:38.045676 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.045646 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 16:41:38.112645 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.112415 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 16:41:38.127326 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.127229 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2"
Apr 24 16:41:38.127484 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.127357 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m"
Apr 24 16:41:38.127484 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:38.127469 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:39.127446062 +0000 UTC m=+151.584759926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : configmap references non-existent config key: service-ca.crt
Apr 24 16:41:38.127609 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:38.127485 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:41:38.127609 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:38.127534 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls podName:8fb63bb2-c5b7-4326-9e09-94fdeae1f646 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:40.127517796 +0000 UTC m=+152.584831655 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vv76m" (UID: "8fb63bb2-c5b7-4326-9e09-94fdeae1f646") : secret "samples-operator-tls" not found Apr 24 16:41:38.196973 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.196896 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z62vj" podStartSLOduration=2.196878964 podStartE2EDuration="2.196878964s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:37.619201882 +0000 UTC m=+150.076515754" watchObservedRunningTime="2026-04-24 16:41:38.196878964 +0000 UTC m=+150.654192836" Apr 24 16:41:38.228312 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.227968 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:38.228312 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.228042 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bslpc\" (UniqueName: \"kubernetes.io/projected/04503171-24fb-471c-9dcc-be6c1d1b3331-kube-api-access-bslpc\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:38.228312 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.228101 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-stats-auth\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:38.228312 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:38.228169 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 16:41:38.228312 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:38.228225 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:39.228208589 +0000 UTC m=+151.685522461 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : secret "router-metrics-certs-default" not found Apr 24 16:41:38.228312 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.228377 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-default-certificate\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:38.231899 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.231844 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bslpc\" (UniqueName: \"kubernetes.io/projected/04503171-24fb-471c-9dcc-be6c1d1b3331-kube-api-access-bslpc\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:38.234130 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:41:38.233774 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-default-certificate\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:38.234525 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.234493 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-stats-auth\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:38.330119 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:38.329407 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:41:38.330119 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:38.329684 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:38.330119 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:38.329750 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls podName:cc0177b3-ada2-4478-8a4f-354de52d1414 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:40.329730914 +0000 UTC m=+152.787044768 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n5n4w" (UID: "cc0177b3-ada2-4478-8a4f-354de52d1414") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:39.138188 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:39.138141 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:39.138559 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:39.138325 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:41.138306177 +0000 UTC m=+153.595620039 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : configmap references non-existent config key: service-ca.crt Apr 24 16:41:39.238801 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:39.238764 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:39.238982 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:39.238913 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 16:41:39.239047 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:39.239004 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:41.238987904 +0000 UTC m=+153.696301757 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : secret "router-metrics-certs-default" not found Apr 24 16:41:40.146636 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:40.146594 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:40.147136 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:40.146770 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 16:41:40.147136 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:40.146858 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls podName:8fb63bb2-c5b7-4326-9e09-94fdeae1f646 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:44.146835472 +0000 UTC m=+156.604149323 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vv76m" (UID: "8fb63bb2-c5b7-4326-9e09-94fdeae1f646") : secret "samples-operator-tls" not found Apr 24 16:41:40.348078 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:40.348027 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:41:40.348248 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:40.348200 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:40.348298 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:40.348287 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls podName:cc0177b3-ada2-4478-8a4f-354de52d1414 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:44.348263833 +0000 UTC m=+156.805577683 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n5n4w" (UID: "cc0177b3-ada2-4478-8a4f-354de52d1414") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:41.155110 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.155067 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:41.155475 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:41.155247 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:45.15522524 +0000 UTC m=+157.612539098 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : configmap references non-existent config key: service-ca.crt Apr 24 16:41:41.256361 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.256320 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:41.256562 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:41.256499 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 16:41:41.256621 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:41.256580 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:45.256559424 +0000 UTC m=+157.713873276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : secret "router-metrics-certs-default" not found Apr 24 16:41:41.545837 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.545801 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf"] Apr 24 16:41:41.548279 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.548249 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" Apr 24 16:41:41.551123 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.551099 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:41.551606 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.551588 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 16:41:41.554920 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.554743 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-g47rm\"" Apr 24 16:41:41.570797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.570768 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf"] Apr 24 16:41:41.624705 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.624660 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" event={"ID":"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9","Type":"ContainerStarted","Data":"40bc2f02878c2b5ef91a42fb883073cfe53856ac285a66227ac8a2e284d3ede8"} Apr 24 16:41:41.626651 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.626323 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch" event={"ID":"3371be78-94e6-4a3b-98b2-9aaad783afa5","Type":"ContainerStarted","Data":"1fa637257d1a31aaeb92dbc7ddae74ebd9300e5433ba84c3c42706445fdb283c"} Apr 24 16:41:41.627987 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.627928 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-jltdt" event={"ID":"bc664fe7-8e82-414a-88b1-9faec08dd651","Type":"ContainerStarted","Data":"b4605385de9cf7f65fafeb6904ba99e7a16db1c6718bdf78e25d29a5906dd16a"} Apr 24 16:41:41.630794 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.630769 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf" event={"ID":"d0708a89-9e11-49df-97d8-8cbf72ad25dd","Type":"ContainerStarted","Data":"ade35193d9d605c2d96856b854db8da47c59b66bad823e9f133417b39307962a"} Apr 24 16:41:41.632178 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.632160 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/0.log" Apr 24 16:41:41.632271 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.632197 2562 generic.go:358] "Generic (PLEG): container finished" podID="7235e10c-761b-4d2d-a4f9-2d8114898c5d" containerID="4331ae6522973639f6a2719b213716893f17c72de9611d065c103834e8a3c1cb" exitCode=255 Apr 24 16:41:41.632271 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.632227 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" event={"ID":"7235e10c-761b-4d2d-a4f9-2d8114898c5d","Type":"ContainerDied","Data":"4331ae6522973639f6a2719b213716893f17c72de9611d065c103834e8a3c1cb"} Apr 24 16:41:41.632448 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.632433 2562 scope.go:117] "RemoveContainer" containerID="4331ae6522973639f6a2719b213716893f17c72de9611d065c103834e8a3c1cb" Apr 24 16:41:41.644094 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.643876 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" podStartSLOduration=2.080607481 podStartE2EDuration="5.643861512s" 
podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:37.117100292 +0000 UTC m=+149.574414141" lastFinishedPulling="2026-04-24 16:41:40.680354313 +0000 UTC m=+153.137668172" observedRunningTime="2026-04-24 16:41:41.642831962 +0000 UTC m=+154.100145831" watchObservedRunningTime="2026-04-24 16:41:41.643861512 +0000 UTC m=+154.101175383" Apr 24 16:41:41.659474 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.659442 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr4cm\" (UniqueName: \"kubernetes.io/projected/d1584eff-fb10-45b7-979a-deabbfef9402-kube-api-access-wr4cm\") pod \"migrator-74bb7799d9-94ptf\" (UID: \"d1584eff-fb10-45b7-979a-deabbfef9402\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" Apr 24 16:41:41.662576 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.662220 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-stqch" podStartSLOduration=1.9131251809999998 podStartE2EDuration="5.662201768s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:36.927266528 +0000 UTC m=+149.384580377" lastFinishedPulling="2026-04-24 16:41:40.676343115 +0000 UTC m=+153.133656964" observedRunningTime="2026-04-24 16:41:41.660893069 +0000 UTC m=+154.118206943" watchObservedRunningTime="2026-04-24 16:41:41.662201768 +0000 UTC m=+154.119515641" Apr 24 16:41:41.706712 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.706664 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf" podStartSLOduration=2.374208263 podStartE2EDuration="5.70664722s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:37.350946071 +0000 UTC m=+149.808259923" lastFinishedPulling="2026-04-24 
16:41:40.683385026 +0000 UTC m=+153.140698880" observedRunningTime="2026-04-24 16:41:41.706210695 +0000 UTC m=+154.163524568" watchObservedRunningTime="2026-04-24 16:41:41.70664722 +0000 UTC m=+154.163961091" Apr 24 16:41:41.735691 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.735642 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-jltdt" podStartSLOduration=2.405699664 podStartE2EDuration="5.735623467s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:37.349897823 +0000 UTC m=+149.807211673" lastFinishedPulling="2026-04-24 16:41:40.679821612 +0000 UTC m=+153.137135476" observedRunningTime="2026-04-24 16:41:41.73447609 +0000 UTC m=+154.191789965" watchObservedRunningTime="2026-04-24 16:41:41.735623467 +0000 UTC m=+154.192937339" Apr 24 16:41:41.760030 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.759996 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr4cm\" (UniqueName: \"kubernetes.io/projected/d1584eff-fb10-45b7-979a-deabbfef9402-kube-api-access-wr4cm\") pod \"migrator-74bb7799d9-94ptf\" (UID: \"d1584eff-fb10-45b7-979a-deabbfef9402\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" Apr 24 16:41:41.770225 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.770198 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr4cm\" (UniqueName: \"kubernetes.io/projected/d1584eff-fb10-45b7-979a-deabbfef9402-kube-api-access-wr4cm\") pod \"migrator-74bb7799d9-94ptf\" (UID: \"d1584eff-fb10-45b7-979a-deabbfef9402\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" Apr 24 16:41:41.860160 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.860079 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" Apr 24 16:41:41.974666 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:41.974628 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf"] Apr 24 16:41:41.977317 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:41.977288 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1584eff_fb10_45b7_979a_deabbfef9402.slice/crio-7c1c441d9a092ca52d4202b3c187200ee12869ce13826c7dc7f9683ff40b2d10 WatchSource:0}: Error finding container 7c1c441d9a092ca52d4202b3c187200ee12869ce13826c7dc7f9683ff40b2d10: Status 404 returned error can't find the container with id 7c1c441d9a092ca52d4202b3c187200ee12869ce13826c7dc7f9683ff40b2d10 Apr 24 16:41:42.636359 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:42.636326 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:41:42.636837 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:42.636817 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/0.log" Apr 24 16:41:42.636889 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:42.636860 2562 generic.go:358] "Generic (PLEG): container finished" podID="7235e10c-761b-4d2d-a4f9-2d8114898c5d" containerID="bb5a2dc08b18c86fa1fd2968c44f865a36a78d5882861eb37fc09efc8dc75a7d" exitCode=255 Apr 24 16:41:42.636926 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:42.636894 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" 
event={"ID":"7235e10c-761b-4d2d-a4f9-2d8114898c5d","Type":"ContainerDied","Data":"bb5a2dc08b18c86fa1fd2968c44f865a36a78d5882861eb37fc09efc8dc75a7d"} Apr 24 16:41:42.637022 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:42.636961 2562 scope.go:117] "RemoveContainer" containerID="4331ae6522973639f6a2719b213716893f17c72de9611d065c103834e8a3c1cb" Apr 24 16:41:42.637302 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:42.637280 2562 scope.go:117] "RemoveContainer" containerID="bb5a2dc08b18c86fa1fd2968c44f865a36a78d5882861eb37fc09efc8dc75a7d" Apr 24 16:41:42.637524 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:42.637503 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-99ws4_openshift-console-operator(7235e10c-761b-4d2d-a4f9-2d8114898c5d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" podUID="7235e10c-761b-4d2d-a4f9-2d8114898c5d" Apr 24 16:41:42.638212 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:42.638140 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" event={"ID":"d1584eff-fb10-45b7-979a-deabbfef9402","Type":"ContainerStarted","Data":"7c1c441d9a092ca52d4202b3c187200ee12869ce13826c7dc7f9683ff40b2d10"} Apr 24 16:41:43.170672 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:43.170600 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m9vmw_9a9da436-211c-45bd-9f7b-51e5eea9f69e/dns-node-resolver/0.log" Apr 24 16:41:43.643051 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:43.643025 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:41:43.643516 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:43.643442 2562 
scope.go:117] "RemoveContainer" containerID="bb5a2dc08b18c86fa1fd2968c44f865a36a78d5882861eb37fc09efc8dc75a7d" Apr 24 16:41:43.643673 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:43.643652 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-99ws4_openshift-console-operator(7235e10c-761b-4d2d-a4f9-2d8114898c5d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" podUID="7235e10c-761b-4d2d-a4f9-2d8114898c5d" Apr 24 16:41:43.644604 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:43.644571 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" event={"ID":"d1584eff-fb10-45b7-979a-deabbfef9402","Type":"ContainerStarted","Data":"81a1569f11725793663d621e7ce201997c8d6f9b1c9cfdfb8672604369d7d3dc"} Apr 24 16:41:43.644721 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:43.644606 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" event={"ID":"d1584eff-fb10-45b7-979a-deabbfef9402","Type":"ContainerStarted","Data":"1a2911cf567b1b2ed9c20ccb435c28c278f0e0cdca136829924c5d64b2c9f04d"} Apr 24 16:41:43.677573 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:43.677438 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-94ptf" podStartSLOduration=1.823194934 podStartE2EDuration="2.677421004s" podCreationTimestamp="2026-04-24 16:41:41 +0000 UTC" firstStartedPulling="2026-04-24 16:41:41.979065277 +0000 UTC m=+154.436379129" lastFinishedPulling="2026-04-24 16:41:42.833291347 +0000 UTC m=+155.290605199" observedRunningTime="2026-04-24 16:41:43.677066791 +0000 UTC m=+156.134380661" watchObservedRunningTime="2026-04-24 16:41:43.677421004 +0000 UTC m=+156.134734876" 
Apr 24 16:41:43.958441 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:43.958352 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" podUID="e8f243a6-3111-4778-9ca9-092db5973836" Apr 24 16:41:43.971420 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:43.971401 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w297f_e5357c94-3bc4-4825-808b-68599eb79e96/node-ca/0.log" Apr 24 16:41:44.010131 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:44.010078 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dj6nz" podUID="bf7a70b6-9c3d-42f2-912b-ae46ee6adb21" Apr 24 16:41:44.028415 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:44.028383 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zgxq7" podUID="444f46c8-3c9a-4e72-8000-ca142ae511ef" Apr 24 16:41:44.126402 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.126369 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vr6mk"] Apr 24 16:41:44.128372 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.128355 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.130634 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.130612 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 16:41:44.130733 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.130608 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jfrvv\"" Apr 24 16:41:44.131697 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.131675 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 16:41:44.131800 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.131722 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 16:41:44.131800 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.131759 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 16:41:44.136713 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.136694 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vr6mk"] Apr 24 16:41:44.182080 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.182048 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:44.182221 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:44.182183 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret 
"samples-operator-tls" not found Apr 24 16:41:44.182280 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:44.182236 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls podName:8fb63bb2-c5b7-4326-9e09-94fdeae1f646 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:52.182222672 +0000 UTC m=+164.639536525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vv76m" (UID: "8fb63bb2-c5b7-4326-9e09-94fdeae1f646") : secret "samples-operator-tls" not found Apr 24 16:41:44.186063 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:44.186027 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wp5zn" podUID="93f47728-cffc-4da9-9791-92c3d70ac2d2" Apr 24 16:41:44.282689 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.282654 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/313cd8e7-aa99-471d-81e0-943bcc07f3f9-signing-key\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.282689 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.282691 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwr45\" (UniqueName: \"kubernetes.io/projected/313cd8e7-aa99-471d-81e0-943bcc07f3f9-kube-api-access-hwr45\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.282903 
ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.282837 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/313cd8e7-aa99-471d-81e0-943bcc07f3f9-signing-cabundle\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.384106 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.384072 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/313cd8e7-aa99-471d-81e0-943bcc07f3f9-signing-key\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.384106 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.384108 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwr45\" (UniqueName: \"kubernetes.io/projected/313cd8e7-aa99-471d-81e0-943bcc07f3f9-kube-api-access-hwr45\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.384340 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.384190 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/313cd8e7-aa99-471d-81e0-943bcc07f3f9-signing-cabundle\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.384340 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.384217 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:41:44.384340 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:44.384313 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:44.384508 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:44.384376 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls podName:cc0177b3-ada2-4478-8a4f-354de52d1414 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:52.38435808 +0000 UTC m=+164.841671930 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n5n4w" (UID: "cc0177b3-ada2-4478-8a4f-354de52d1414") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:44.384886 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.384859 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/313cd8e7-aa99-471d-81e0-943bcc07f3f9-signing-cabundle\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.386483 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.386453 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/313cd8e7-aa99-471d-81e0-943bcc07f3f9-signing-key\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.392608 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:41:44.392586 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwr45\" (UniqueName: \"kubernetes.io/projected/313cd8e7-aa99-471d-81e0-943bcc07f3f9-kube-api-access-hwr45\") pod \"service-ca-865cb79987-vr6mk\" (UID: \"313cd8e7-aa99-471d-81e0-943bcc07f3f9\") " pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.437712 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.437664 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vr6mk" Apr 24 16:41:44.572463 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.572372 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vr6mk"] Apr 24 16:41:44.575822 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:44.575794 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod313cd8e7_aa99_471d_81e0_943bcc07f3f9.slice/crio-e013609596a90911d992162771ceae63f65d079d83e350bd70402938018f737a WatchSource:0}: Error finding container e013609596a90911d992162771ceae63f65d079d83e350bd70402938018f737a: Status 404 returned error can't find the container with id e013609596a90911d992162771ceae63f65d079d83e350bd70402938018f737a Apr 24 16:41:44.649057 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.649021 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vr6mk" event={"ID":"313cd8e7-aa99-471d-81e0-943bcc07f3f9","Type":"ContainerStarted","Data":"56b46fb25e62f3a3e0c54788f1d84b61368a6225e6b3db7ee56cb88484fe3724"} Apr 24 16:41:44.649486 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.649064 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vr6mk" 
event={"ID":"313cd8e7-aa99-471d-81e0-943bcc07f3f9","Type":"ContainerStarted","Data":"e013609596a90911d992162771ceae63f65d079d83e350bd70402938018f737a"} Apr 24 16:41:44.649486 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.649153 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dj6nz" Apr 24 16:41:44.649486 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.649272 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgxq7" Apr 24 16:41:44.649486 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.649416 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:41:44.671028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:44.670973 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-vr6mk" podStartSLOduration=0.670926905 podStartE2EDuration="670.926905ms" podCreationTimestamp="2026-04-24 16:41:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:44.6697383 +0000 UTC m=+157.127052192" watchObservedRunningTime="2026-04-24 16:41:44.670926905 +0000 UTC m=+157.128240777" Apr 24 16:41:45.191712 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:45.191677 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:45.191889 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:45.191845 2562 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:53.191827224 +0000 UTC m=+165.649141077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : configmap references non-existent config key: service-ca.crt Apr 24 16:41:45.292777 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:45.292741 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:45.292928 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:45.292871 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 16:41:45.292928 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:45.292924 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs podName:04503171-24fb-471c-9dcc-be6c1d1b3331 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:53.292910225 +0000 UTC m=+165.750224075 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs") pod "router-default-d8c574-57dx2" (UID: "04503171-24fb-471c-9dcc-be6c1d1b3331") : secret "router-metrics-certs-default" not found Apr 24 16:41:46.662375 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:46.662343 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:46.662778 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:46.662391 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:46.662854 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:46.662838 2562 scope.go:117] "RemoveContainer" containerID="bb5a2dc08b18c86fa1fd2968c44f865a36a78d5882861eb37fc09efc8dc75a7d" Apr 24 16:41:46.663096 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:46.663074 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-99ws4_openshift-console-operator(7235e10c-761b-4d2d-a4f9-2d8114898c5d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" podUID="7235e10c-761b-4d2d-a4f9-2d8114898c5d" Apr 24 16:41:48.927114 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:48.927069 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz" Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:48.927183 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:48.927209 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7" Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:48.927211 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:48.927285 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls podName:bf7a70b6-9c3d-42f2-912b-ae46ee6adb21 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:50.927268211 +0000 UTC m=+283.384582060 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls") pod "dns-default-dj6nz" (UID: "bf7a70b6-9c3d-42f2-912b-ae46ee6adb21") : secret "dns-default-metrics-tls" not found Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:48.927308 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:48.927315 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:48.927328 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56d5dcfb77-vmzmz: secret "image-registry-tls" not found Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:48.927374 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert podName:444f46c8-3c9a-4e72-8000-ca142ae511ef nodeName:}" failed. No retries permitted until 2026-04-24 16:43:50.927358909 +0000 UTC m=+283.384672758 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert") pod "ingress-canary-zgxq7" (UID: "444f46c8-3c9a-4e72-8000-ca142ae511ef") : secret "canary-serving-cert" not found Apr 24 16:41:48.927622 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:48.927389 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls podName:e8f243a6-3111-4778-9ca9-092db5973836 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:50.927381696 +0000 UTC m=+283.384695544 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls") pod "image-registry-56d5dcfb77-vmzmz" (UID: "e8f243a6-3111-4778-9ca9-092db5973836") : secret "image-registry-tls" not found Apr 24 16:41:52.254458 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:52.254416 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:52.257022 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:52.256995 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fb63bb2-c5b7-4326-9e09-94fdeae1f646-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vv76m\" (UID: \"8fb63bb2-c5b7-4326-9e09-94fdeae1f646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:52.456265 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:52.456221 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:41:52.456438 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:52.456356 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:52.456438 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:41:52.456421 2562 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls podName:cc0177b3-ada2-4478-8a4f-354de52d1414 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:08.456405333 +0000 UTC m=+180.913719182 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n5n4w" (UID: "cc0177b3-ada2-4478-8a4f-354de52d1414") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:52.555738 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:52.555663 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" Apr 24 16:41:52.675031 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:52.675001 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m"] Apr 24 16:41:53.263950 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:53.263900 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:53.264597 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:53.264574 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04503171-24fb-471c-9dcc-be6c1d1b3331-service-ca-bundle\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:53.364724 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:41:53.364688 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:53.367427 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:53.367399 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04503171-24fb-471c-9dcc-be6c1d1b3331-metrics-certs\") pod \"router-default-d8c574-57dx2\" (UID: \"04503171-24fb-471c-9dcc-be6c1d1b3331\") " pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:53.598072 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:53.597988 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:53.680112 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:53.679465 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" event={"ID":"8fb63bb2-c5b7-4326-9e09-94fdeae1f646","Type":"ContainerStarted","Data":"89b9a0ca7f4be7bb36dd23323aa32746389cf4f83d6c45858c256c01516bbe80"} Apr 24 16:41:53.729752 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:53.729727 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d8c574-57dx2"] Apr 24 16:41:53.732264 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:41:53.732238 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04503171_24fb_471c_9dcc_be6c1d1b3331.slice/crio-02ce8e57e19ca93b7996b2c707694c1d53909aa02f7d88c70859245bc7325693 WatchSource:0}: Error finding container 02ce8e57e19ca93b7996b2c707694c1d53909aa02f7d88c70859245bc7325693: Status 404 returned 
error can't find the container with id 02ce8e57e19ca93b7996b2c707694c1d53909aa02f7d88c70859245bc7325693 Apr 24 16:41:54.683686 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:54.683593 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" event={"ID":"8fb63bb2-c5b7-4326-9e09-94fdeae1f646","Type":"ContainerStarted","Data":"055b9dd8610b5c0f14701f49afb2a74b36f20ff8db12b702be5452acc03ed834"} Apr 24 16:41:54.683686 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:54.683640 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" event={"ID":"8fb63bb2-c5b7-4326-9e09-94fdeae1f646","Type":"ContainerStarted","Data":"b11877e37614c9e53f00498df95330c8f3a610546a1cd08b5df8af18fa4261fe"} Apr 24 16:41:54.684872 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:54.684847 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d8c574-57dx2" event={"ID":"04503171-24fb-471c-9dcc-be6c1d1b3331","Type":"ContainerStarted","Data":"9e621d3fe15bff1bacc2de2a3df595347301c29d343e59881d6b57a2f10a67d6"} Apr 24 16:41:54.684986 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:54.684878 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d8c574-57dx2" event={"ID":"04503171-24fb-471c-9dcc-be6c1d1b3331","Type":"ContainerStarted","Data":"02ce8e57e19ca93b7996b2c707694c1d53909aa02f7d88c70859245bc7325693"} Apr 24 16:41:54.706398 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:54.706344 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vv76m" podStartSLOduration=17.173191377 podStartE2EDuration="18.706330365s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:52.724020051 +0000 UTC m=+165.181333906" 
lastFinishedPulling="2026-04-24 16:41:54.257159035 +0000 UTC m=+166.714472894" observedRunningTime="2026-04-24 16:41:54.704885274 +0000 UTC m=+167.162199143" watchObservedRunningTime="2026-04-24 16:41:54.706330365 +0000 UTC m=+167.163644235" Apr 24 16:41:54.724315 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:54.724268 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-d8c574-57dx2" podStartSLOduration=18.724254621 podStartE2EDuration="18.724254621s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:54.723916032 +0000 UTC m=+167.181229905" watchObservedRunningTime="2026-04-24 16:41:54.724254621 +0000 UTC m=+167.181568492" Apr 24 16:41:55.173172 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:55.173133 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn" Apr 24 16:41:55.598252 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:55.598215 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:55.600760 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:55.600736 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:55.687635 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:55.687608 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:55.688733 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:55.688713 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-d8c574-57dx2" Apr 24 16:41:58.174511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:58.174421 2562 scope.go:117] 
"RemoveContainer" containerID="bb5a2dc08b18c86fa1fd2968c44f865a36a78d5882861eb37fc09efc8dc75a7d" Apr 24 16:41:58.697555 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:58.697526 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:41:58.697727 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:58.697592 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" event={"ID":"7235e10c-761b-4d2d-a4f9-2d8114898c5d","Type":"ContainerStarted","Data":"8ea4036ce2ea92223a4c4669ae1b748183ce311e1dc82b9badb9463a076215f9"} Apr 24 16:41:58.697912 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:58.697888 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:58.702608 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:58.702584 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" Apr 24 16:41:58.716437 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:41:58.716394 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-99ws4" podStartSLOduration=18.824691535 podStartE2EDuration="22.716381743s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:36.786496647 +0000 UTC m=+149.243810497" lastFinishedPulling="2026-04-24 16:41:40.67818685 +0000 UTC m=+153.135500705" observedRunningTime="2026-04-24 16:41:58.715984897 +0000 UTC m=+171.173298767" watchObservedRunningTime="2026-04-24 16:41:58.716381743 +0000 UTC m=+171.173695612" Apr 24 16:42:00.703831 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:00.703793 2562 generic.go:358] "Generic (PLEG): container finished" 
podID="5e812eb2-480c-4a84-b382-d7c26bd0da17" containerID="75344998c22843bc3d0cd65de3c82e582673116e7c5fd3b97a1d05e1e4fdf0ea" exitCode=255 Apr 24 16:42:00.704322 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:00.703867 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" event={"ID":"5e812eb2-480c-4a84-b382-d7c26bd0da17","Type":"ContainerDied","Data":"75344998c22843bc3d0cd65de3c82e582673116e7c5fd3b97a1d05e1e4fdf0ea"} Apr 24 16:42:00.709903 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:00.709884 2562 scope.go:117] "RemoveContainer" containerID="75344998c22843bc3d0cd65de3c82e582673116e7c5fd3b97a1d05e1e4fdf0ea" Apr 24 16:42:01.708271 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:01.708237 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" event={"ID":"5e812eb2-480c-4a84-b382-d7c26bd0da17","Type":"ContainerStarted","Data":"41883531f55cb97b3f74d173839eb59860d72c506e6d75d325f66212b6354e97"} Apr 24 16:42:05.290029 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.289998 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9rqvf"] Apr 24 16:42:05.339570 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.339539 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9rqvf"] Apr 24 16:42:05.339737 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.339681 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.342318 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.342291 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zvkrd\"" Apr 24 16:42:05.342447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.342317 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 16:42:05.342447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.342291 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 16:42:05.470542 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.470505 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kldtz\" (UniqueName: \"kubernetes.io/projected/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-kube-api-access-kldtz\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.470542 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.470543 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-data-volume\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.470806 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.470563 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-crio-socket\") pod \"insights-runtime-extractor-9rqvf\" (UID: 
\"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.470806 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.470610 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.470806 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.470742 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.571439 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.571359 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.571439 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.571429 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kldtz\" (UniqueName: \"kubernetes.io/projected/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-kube-api-access-kldtz\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.571671 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:42:05.571579 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-data-volume\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.571671 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.571618 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-crio-socket\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.571671 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.571641 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.571815 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.571717 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-crio-socket\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.572018 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.571996 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-data-volume\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " 
pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.572228 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.572211 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.573731 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.573708 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.584150 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.584119 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kldtz\" (UniqueName: \"kubernetes.io/projected/31adc9d7-5a88-4a54-885a-6d3a7451a4d5-kube-api-access-kldtz\") pod \"insights-runtime-extractor-9rqvf\" (UID: \"31adc9d7-5a88-4a54-885a-6d3a7451a4d5\") " pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.650171 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.650144 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9rqvf" Apr 24 16:42:05.771467 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:05.771444 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9rqvf"] Apr 24 16:42:05.774110 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:05.774078 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31adc9d7_5a88_4a54_885a_6d3a7451a4d5.slice/crio-80adc249ab523b9e4e8fd5467aed3088c128e99970507f9c2b3276cfb050b3b5 WatchSource:0}: Error finding container 80adc249ab523b9e4e8fd5467aed3088c128e99970507f9c2b3276cfb050b3b5: Status 404 returned error can't find the container with id 80adc249ab523b9e4e8fd5467aed3088c128e99970507f9c2b3276cfb050b3b5 Apr 24 16:42:06.722300 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:06.722265 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9rqvf" event={"ID":"31adc9d7-5a88-4a54-885a-6d3a7451a4d5","Type":"ContainerStarted","Data":"4c3f5bd277d21029c95ceb62ea5eb0a59b12ca82a6c1ef296b8e089b2f524e60"} Apr 24 16:42:06.722300 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:06.722303 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9rqvf" event={"ID":"31adc9d7-5a88-4a54-885a-6d3a7451a4d5","Type":"ContainerStarted","Data":"80adc249ab523b9e4e8fd5467aed3088c128e99970507f9c2b3276cfb050b3b5"} Apr 24 16:42:07.726770 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:07.726734 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9rqvf" event={"ID":"31adc9d7-5a88-4a54-885a-6d3a7451a4d5","Type":"ContainerStarted","Data":"847a86bbcac98febb8e5c9657c2c245e5c9d564f5adefc0635a7dce8ea3cc131"} Apr 24 16:42:08.499246 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.499210 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:42:08.501639 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.501606 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc0177b3-ada2-4478-8a4f-354de52d1414-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n5n4w\" (UID: \"cc0177b3-ada2-4478-8a4f-354de52d1414\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:42:08.551563 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.551530 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f6b6b6c98-mzqb5"] Apr 24 16:42:08.553952 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.553912 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.556615 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.556585 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:42:08.556745 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.556661 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6wdtj\"" Apr 24 16:42:08.556745 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.556665 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:42:08.556968 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.556669 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:42:08.557044 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.556970 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:42:08.557190 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.557173 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:42:08.557319 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.557273 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:42:08.557319 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.557273 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:42:08.564696 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.564675 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f6b6b6c98-mzqb5"] Apr 24 16:42:08.599797 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:42:08.599769 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-serving-cert\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.599797 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.599798 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-service-ca\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.600050 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.599895 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-oauth-serving-cert\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.600050 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.599928 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-config\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.600050 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.599970 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llsf8\" (UniqueName: \"kubernetes.io/projected/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-kube-api-access-llsf8\") pod \"console-7f6b6b6c98-mzqb5\" (UID: 
\"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.600050 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.600035 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-oauth-config\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.701338 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.701255 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-oauth-config\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.701338 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.701303 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-serving-cert\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.701338 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.701328 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-service-ca\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.701581 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.701393 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-oauth-serving-cert\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.701581 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.701439 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-config\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.701581 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.701470 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llsf8\" (UniqueName: \"kubernetes.io/projected/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-kube-api-access-llsf8\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.702231 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.702198 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-service-ca\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.702408 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.702226 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-config\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.702408 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.702242 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-oauth-serving-cert\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.704028 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.704001 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-oauth-config\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.704116 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.704012 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-serving-cert\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.710173 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.710149 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llsf8\" (UniqueName: \"kubernetes.io/projected/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-kube-api-access-llsf8\") pod \"console-7f6b6b6c98-mzqb5\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.728689 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.728455 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-rd7fz\"" Apr 24 16:42:08.732649 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.732625 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9rqvf" 
event={"ID":"31adc9d7-5a88-4a54-885a-6d3a7451a4d5","Type":"ContainerStarted","Data":"22e662d7d40feaac5bd90ada70cc40b0775f653224d0135bd4851a94552ed2c5"} Apr 24 16:42:08.736267 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.736250 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" Apr 24 16:42:08.750654 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.750609 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9rqvf" podStartSLOduration=1.2638039 podStartE2EDuration="3.750595253s" podCreationTimestamp="2026-04-24 16:42:05 +0000 UTC" firstStartedPulling="2026-04-24 16:42:05.877301202 +0000 UTC m=+178.334615055" lastFinishedPulling="2026-04-24 16:42:08.364092544 +0000 UTC m=+180.821406408" observedRunningTime="2026-04-24 16:42:08.749973418 +0000 UTC m=+181.207287289" watchObservedRunningTime="2026-04-24 16:42:08.750595253 +0000 UTC m=+181.207909124" Apr 24 16:42:08.853008 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.852917 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w"] Apr 24 16:42:08.857769 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:08.857738 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0177b3_ada2_4478_8a4f_354de52d1414.slice/crio-58225b4e4a46434689eb59957d77858c7fb7d32bd5ddf085b86be01523d339b5 WatchSource:0}: Error finding container 58225b4e4a46434689eb59957d77858c7fb7d32bd5ddf085b86be01523d339b5: Status 404 returned error can't find the container with id 58225b4e4a46434689eb59957d77858c7fb7d32bd5ddf085b86be01523d339b5 Apr 24 16:42:08.863049 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.863026 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:08.977695 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:08.977665 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f6b6b6c98-mzqb5"] Apr 24 16:42:08.980819 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:08.980794 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba76f817_dd2f_4bdd_ba73_a83baa5abee0.slice/crio-d993a3e3b72c3b21a1c99c2323fe643d43939e7f5f975709b03c8d92b62f1e4b WatchSource:0}: Error finding container d993a3e3b72c3b21a1c99c2323fe643d43939e7f5f975709b03c8d92b62f1e4b: Status 404 returned error can't find the container with id d993a3e3b72c3b21a1c99c2323fe643d43939e7f5f975709b03c8d92b62f1e4b Apr 24 16:42:09.737554 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:09.737510 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6b6b6c98-mzqb5" event={"ID":"ba76f817-dd2f-4bdd-ba73-a83baa5abee0","Type":"ContainerStarted","Data":"d993a3e3b72c3b21a1c99c2323fe643d43939e7f5f975709b03c8d92b62f1e4b"} Apr 24 16:42:09.739265 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:09.739227 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" event={"ID":"cc0177b3-ada2-4478-8a4f-354de52d1414","Type":"ContainerStarted","Data":"58225b4e4a46434689eb59957d77858c7fb7d32bd5ddf085b86be01523d339b5"} Apr 24 16:42:10.744101 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:10.744028 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" event={"ID":"cc0177b3-ada2-4478-8a4f-354de52d1414","Type":"ContainerStarted","Data":"86059fc59322b509306a8c2300421db21fa59722c5d3babdfccf4f588cc3472b"} Apr 24 16:42:10.761942 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:10.761882 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n5n4w" podStartSLOduration=33.064232677 podStartE2EDuration="34.761863344s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="2026-04-24 16:42:08.859585103 +0000 UTC m=+181.316898955" lastFinishedPulling="2026-04-24 16:42:10.557215762 +0000 UTC m=+183.014529622" observedRunningTime="2026-04-24 16:42:10.761546498 +0000 UTC m=+183.218860369" watchObservedRunningTime="2026-04-24 16:42:10.761863344 +0000 UTC m=+183.219177217" Apr 24 16:42:11.097405 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.097371 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th"] Apr 24 16:42:11.099759 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.099735 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" Apr 24 16:42:11.103530 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.103297 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-s484d\"" Apr 24 16:42:11.103735 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.103711 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 16:42:11.109877 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.109789 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th"] Apr 24 16:42:11.224826 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.224791 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/da0109d3-7192-4ad8-a35a-b4c4e4438cbe-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hx5th\" (UID: \"da0109d3-7192-4ad8-a35a-b4c4e4438cbe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" Apr 24 16:42:11.326155 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.326112 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/da0109d3-7192-4ad8-a35a-b4c4e4438cbe-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hx5th\" (UID: \"da0109d3-7192-4ad8-a35a-b4c4e4438cbe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" Apr 24 16:42:11.328993 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.328961 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/da0109d3-7192-4ad8-a35a-b4c4e4438cbe-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hx5th\" (UID: \"da0109d3-7192-4ad8-a35a-b4c4e4438cbe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" Apr 24 16:42:11.412145 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.412061 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" Apr 24 16:42:11.830184 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:11.830155 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th"] Apr 24 16:42:11.833054 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:11.833024 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0109d3_7192_4ad8_a35a_b4c4e4438cbe.slice/crio-8a5f32502ca0e8b2775f8e4bb49a1f828faeebf6d689310bc80d4ab9e5ac26ea WatchSource:0}: Error finding container 8a5f32502ca0e8b2775f8e4bb49a1f828faeebf6d689310bc80d4ab9e5ac26ea: Status 404 returned error can't find the container with id 8a5f32502ca0e8b2775f8e4bb49a1f828faeebf6d689310bc80d4ab9e5ac26ea Apr 24 16:42:12.749960 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:12.749912 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6b6b6c98-mzqb5" event={"ID":"ba76f817-dd2f-4bdd-ba73-a83baa5abee0","Type":"ContainerStarted","Data":"b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846"} Apr 24 16:42:12.750984 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:12.750963 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" event={"ID":"da0109d3-7192-4ad8-a35a-b4c4e4438cbe","Type":"ContainerStarted","Data":"8a5f32502ca0e8b2775f8e4bb49a1f828faeebf6d689310bc80d4ab9e5ac26ea"} Apr 24 16:42:12.769129 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:12.769077 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f6b6b6c98-mzqb5" podStartSLOduration=2.016756876 podStartE2EDuration="4.769062833s" podCreationTimestamp="2026-04-24 16:42:08 +0000 UTC" firstStartedPulling="2026-04-24 16:42:08.982700625 +0000 UTC m=+181.440014474" 
lastFinishedPulling="2026-04-24 16:42:11.735006582 +0000 UTC m=+184.192320431" observedRunningTime="2026-04-24 16:42:12.767688041 +0000 UTC m=+185.225001912" watchObservedRunningTime="2026-04-24 16:42:12.769062833 +0000 UTC m=+185.226376703" Apr 24 16:42:13.754906 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:13.754860 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" event={"ID":"da0109d3-7192-4ad8-a35a-b4c4e4438cbe","Type":"ContainerStarted","Data":"cff0ef686a5b8ba1339766d6847b3338e8a787f85beea67644c46560d9549374"} Apr 24 16:42:13.755413 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:13.755088 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" Apr 24 16:42:13.759818 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:13.759800 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" Apr 24 16:42:13.772292 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:13.772254 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hx5th" podStartSLOduration=1.861985707 podStartE2EDuration="2.772241871s" podCreationTimestamp="2026-04-24 16:42:11 +0000 UTC" firstStartedPulling="2026-04-24 16:42:11.834802213 +0000 UTC m=+184.292116066" lastFinishedPulling="2026-04-24 16:42:12.745058371 +0000 UTC m=+185.202372230" observedRunningTime="2026-04-24 16:42:13.771038157 +0000 UTC m=+186.228352041" watchObservedRunningTime="2026-04-24 16:42:13.772241871 +0000 UTC m=+186.229555737" Apr 24 16:42:18.863293 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:18.863256 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 
16:42:18.863293 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:18.863301 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:18.864739 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:18.864713 2562 patch_prober.go:28] interesting pod/console-7f6b6b6c98-mzqb5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.21:8443/health\": dial tcp 10.133.0.21:8443: connect: connection refused" start-of-body= Apr 24 16:42:18.864846 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:18.864759 2562 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7f6b6b6c98-mzqb5" podUID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" containerName="console" probeResult="failure" output="Get \"https://10.133.0.21:8443/health\": dial tcp 10.133.0.21:8443: connect: connection refused" Apr 24 16:42:26.549112 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.549078 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-s6tzg"] Apr 24 16:42:26.551867 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.551843 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.554630 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.554601 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 16:42:26.554630 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.554628 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-l54nx\"" Apr 24 16:42:26.555759 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.555741 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 16:42:26.555894 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.555790 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:42:26.555974 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.555797 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 16:42:26.567064 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.567035 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-s6tzg"] Apr 24 16:42:26.573737 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.573712 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lq8jc"] Apr 24 16:42:26.576682 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.576661 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.579024 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.579006 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:42:26.579178 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.579117 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:42:26.579178 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.579172 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:42:26.579290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.579122 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fvkdr\"" Apr 24 16:42:26.655267 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.655229 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9kll\" (UniqueName: \"kubernetes.io/projected/c67686de-860d-4144-a49a-f0d703568add-kube-api-access-f9kll\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.655431 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.655293 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.655431 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:42:26.655368 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.655431 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.655405 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.655607 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.655437 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c67686de-860d-4144-a49a-f0d703568add-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.655607 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.655463 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c67686de-860d-4144-a49a-f0d703568add-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.756359 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756321 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-textfile\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.756589 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756563 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.756712 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756603 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-wtmp\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.756712 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756662 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.756832 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756764 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-tls\") pod 
\"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.756832 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756813 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c67686de-860d-4144-a49a-f0d703568add-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.756926 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756841 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c67686de-860d-4144-a49a-f0d703568add-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.756926 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756918 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-root\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.757122 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.756990 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9kll\" (UniqueName: \"kubernetes.io/projected/c67686de-860d-4144-a49a-f0d703568add-kube-api-access-f9kll\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.757122 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.757033 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-tls\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.757122 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.757090 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f15e46f0-058b-4e83-8863-31e18b978144-metrics-client-ca\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.757268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.757131 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m5d2\" (UniqueName: \"kubernetes.io/projected/f15e46f0-058b-4e83-8863-31e18b978144-kube-api-access-8m5d2\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.757268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.757167 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-accelerators-collector-config\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.757268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.757219 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.757268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.757245 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-sys\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.757865 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.757846 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c67686de-860d-4144-a49a-f0d703568add-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.758382 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.758359 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.758525 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.758498 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c67686de-860d-4144-a49a-f0d703568add-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.759655 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.759617 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.760716 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.760693 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c67686de-860d-4144-a49a-f0d703568add-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.770043 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.770020 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9kll\" (UniqueName: \"kubernetes.io/projected/c67686de-860d-4144-a49a-f0d703568add-kube-api-access-f9kll\") pod \"kube-state-metrics-69db897b98-s6tzg\" (UID: \"c67686de-860d-4144-a49a-f0d703568add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.858073 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.857976 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-tls\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858073 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858054 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f15e46f0-058b-4e83-8863-31e18b978144-metrics-client-ca\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858093 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m5d2\" (UniqueName: \"kubernetes.io/projected/f15e46f0-058b-4e83-8863-31e18b978144-kube-api-access-8m5d2\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858125 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-accelerators-collector-config\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858167 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-sys\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858203 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-textfile\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858232 
2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858266 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-sys\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858290 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858285 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-wtmp\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858625 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858398 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-root\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858625 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858412 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-wtmp\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858625 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:42:26.858452 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f15e46f0-058b-4e83-8863-31e18b978144-root\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858777 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858665 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-textfile\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858877 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858857 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f15e46f0-058b-4e83-8863-31e18b978144-metrics-client-ca\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.858952 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.858857 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-accelerators-collector-config\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.860416 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.860397 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-tls\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.860849 
ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.860832 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f15e46f0-058b-4e83-8863-31e18b978144-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.862905 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.862881 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" Apr 24 16:42:26.873927 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.873906 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m5d2\" (UniqueName: \"kubernetes.io/projected/f15e46f0-058b-4e83-8863-31e18b978144-kube-api-access-8m5d2\") pod \"node-exporter-lq8jc\" (UID: \"f15e46f0-058b-4e83-8863-31e18b978144\") " pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.889044 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.888956 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lq8jc" Apr 24 16:42:26.899227 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:26.899193 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf15e46f0_058b_4e83_8863_31e18b978144.slice/crio-8dd70ff40e5de02685ed2924a793c25e11357bf727c42932fe0de5033577ad8f WatchSource:0}: Error finding container 8dd70ff40e5de02685ed2924a793c25e11357bf727c42932fe0de5033577ad8f: Status 404 returned error can't find the container with id 8dd70ff40e5de02685ed2924a793c25e11357bf727c42932fe0de5033577ad8f Apr 24 16:42:26.994494 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:26.994469 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-s6tzg"] Apr 24 16:42:26.996537 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:26.996503 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67686de_860d_4144_a49a_f0d703568add.slice/crio-797652e0746f5264ea1fc7a625d7017ff217fa343cdec6b9043835b8ff3d2e61 WatchSource:0}: Error finding container 797652e0746f5264ea1fc7a625d7017ff217fa343cdec6b9043835b8ff3d2e61: Status 404 returned error can't find the container with id 797652e0746f5264ea1fc7a625d7017ff217fa343cdec6b9043835b8ff3d2e61 Apr 24 16:42:27.595147 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.594400 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:42:27.598008 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.597980 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:42:27.602362 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.602154 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 16:42:27.604819 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.604765 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 16:42:27.604819 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.604788 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 16:42:27.606556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.605008 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 16:42:27.606556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.605085 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 16:42:27.606556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.605219 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 16:42:27.606556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.605261 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 16:42:27.606556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.605449 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 16:42:27.606556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.606101 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lrlbd\"" Apr 24 16:42:27.606556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.606298 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 16:42:27.614321 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.611279 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:42:27.767314 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767253 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:42:27.767314 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767299 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7427106d-6c5c-457d-a96f-8d79db7264aa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:42:27.767524 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767398 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7427106d-6c5c-457d-a96f-8d79db7264aa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:42:27.767524 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767438 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7427106d-6c5c-457d-a96f-8d79db7264aa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.767524 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767470 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7427106d-6c5c-457d-a96f-8d79db7264aa-config-out\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.767524 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767512 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.767725 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767642 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.767725 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767684 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7427106d-6c5c-457d-a96f-8d79db7264aa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.767833 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767733 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-config-volume\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.767833 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767770 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm288\" (UniqueName: \"kubernetes.io/projected/7427106d-6c5c-457d-a96f-8d79db7264aa-kube-api-access-qm288\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.767833 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767803 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.768009 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767847 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-web-config\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.768009 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.767886 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.794012 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.793975 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" event={"ID":"c67686de-860d-4144-a49a-f0d703568add","Type":"ContainerStarted","Data":"797652e0746f5264ea1fc7a625d7017ff217fa343cdec6b9043835b8ff3d2e61"}
Apr 24 16:42:27.795676 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.795635 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lq8jc" event={"ID":"f15e46f0-058b-4e83-8863-31e18b978144","Type":"ContainerStarted","Data":"9b899edc6ee5fd2315cba921c4583d9f03141218969799d740320aa44809e0db"}
Apr 24 16:42:27.795676 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.795672 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lq8jc" event={"ID":"f15e46f0-058b-4e83-8863-31e18b978144","Type":"ContainerStarted","Data":"8dd70ff40e5de02685ed2924a793c25e11357bf727c42932fe0de5033577ad8f"}
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869186 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-config-volume\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869252 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm288\" (UniqueName: \"kubernetes.io/projected/7427106d-6c5c-457d-a96f-8d79db7264aa-kube-api-access-qm288\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869285 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869316 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-web-config\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869341 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869387 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869413 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7427106d-6c5c-457d-a96f-8d79db7264aa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869452 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7427106d-6c5c-457d-a96f-8d79db7264aa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869476 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7427106d-6c5c-457d-a96f-8d79db7264aa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.869601 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869599 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7427106d-6c5c-457d-a96f-8d79db7264aa-config-out\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.870270 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869659 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.870270 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869759 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.870270 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.869797 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7427106d-6c5c-457d-a96f-8d79db7264aa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.870270 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.870258 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7427106d-6c5c-457d-a96f-8d79db7264aa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.870488 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:42:27.870463 2562 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 24 16:42:27.870553 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.870538 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7427106d-6c5c-457d-a96f-8d79db7264aa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.870607 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:42:27.870545 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-main-tls podName:7427106d-6c5c-457d-a96f-8d79db7264aa nodeName:}" failed. No retries permitted until 2026-04-24 16:42:28.370525908 +0000 UTC m=+200.827839763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "7427106d-6c5c-457d-a96f-8d79db7264aa") : secret "alertmanager-main-tls" not found
Apr 24 16:42:27.872241 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.871693 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7427106d-6c5c-457d-a96f-8d79db7264aa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.873415 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.873378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7427106d-6c5c-457d-a96f-8d79db7264aa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.875647 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.875585 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-web-config\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.875647 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.875585 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.875839 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.875650 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.876049 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.876027 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-config-volume\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.877100 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.877058 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.877238 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.877221 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.879112 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.877397 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7427106d-6c5c-457d-a96f-8d79db7264aa-config-out\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:27.887314 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:27.882705 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm288\" (UniqueName: \"kubernetes.io/projected/7427106d-6c5c-457d-a96f-8d79db7264aa-kube-api-access-qm288\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:28.375542 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.375473 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:28.379511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.379435 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7427106d-6c5c-457d-a96f-8d79db7264aa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7427106d-6c5c-457d-a96f-8d79db7264aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:28.518142 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.518108 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 16:42:28.677622 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.677592 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 16:42:28.678908 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:28.678878 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7427106d_6c5c_457d_a96f_8d79db7264aa.slice/crio-8f44e8f585101ed12a325f937e2c32e987eae089d8ca549eba7c5043a738b255 WatchSource:0}: Error finding container 8f44e8f585101ed12a325f937e2c32e987eae089d8ca549eba7c5043a738b255: Status 404 returned error can't find the container with id 8f44e8f585101ed12a325f937e2c32e987eae089d8ca549eba7c5043a738b255
Apr 24 16:42:28.800268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.800169 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" event={"ID":"c67686de-860d-4144-a49a-f0d703568add","Type":"ContainerStarted","Data":"4f4be6b61460c8c165557029f280fdae08d00c50668254911bb8daa8a02e642b"}
Apr 24 16:42:28.800268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.800211 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" event={"ID":"c67686de-860d-4144-a49a-f0d703568add","Type":"ContainerStarted","Data":"0596eed5c66f085e6e29e053365e8ec30123dfae94507581a85ae1599a28ab46"}
Apr 24 16:42:28.800268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.800227 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" event={"ID":"c67686de-860d-4144-a49a-f0d703568add","Type":"ContainerStarted","Data":"7e18ee0babd918c8cd1d993b3d3c56f607a774d74cbc14d1661036980624a448"}
Apr 24 16:42:28.801208 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.801183 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerStarted","Data":"8f44e8f585101ed12a325f937e2c32e987eae089d8ca549eba7c5043a738b255"}
Apr 24 16:42:28.802447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.802423 2562 generic.go:358] "Generic (PLEG): container finished" podID="f15e46f0-058b-4e83-8863-31e18b978144" containerID="9b899edc6ee5fd2315cba921c4583d9f03141218969799d740320aa44809e0db" exitCode=0
Apr 24 16:42:28.802508 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.802469 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lq8jc" event={"ID":"f15e46f0-058b-4e83-8863-31e18b978144","Type":"ContainerDied","Data":"9b899edc6ee5fd2315cba921c4583d9f03141218969799d740320aa44809e0db"}
Apr 24 16:42:28.822637 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.822589 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-s6tzg" podStartSLOduration=1.6075743999999998 podStartE2EDuration="2.822575493s" podCreationTimestamp="2026-04-24 16:42:26 +0000 UTC" firstStartedPulling="2026-04-24 16:42:26.998509217 +0000 UTC m=+199.455823072" lastFinishedPulling="2026-04-24 16:42:28.213510312 +0000 UTC m=+200.670824165" observedRunningTime="2026-04-24 16:42:28.820471397 +0000 UTC m=+201.277785272" watchObservedRunningTime="2026-04-24 16:42:28.822575493 +0000 UTC m=+201.279889363"
Apr 24 16:42:28.863351 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.863310 2562 patch_prober.go:28] interesting pod/console-7f6b6b6c98-mzqb5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.21:8443/health\": dial tcp 10.133.0.21:8443: connect: connection refused" start-of-body=
Apr 24 16:42:28.863469 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:28.863356 2562 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7f6b6b6c98-mzqb5" podUID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" containerName="console" probeResult="failure" output="Get \"https://10.133.0.21:8443/health\": dial tcp 10.133.0.21:8443: connect: connection refused"
Apr 24 16:42:29.806950 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:29.806905 2562 generic.go:358] "Generic (PLEG): container finished" podID="7427106d-6c5c-457d-a96f-8d79db7264aa" containerID="79eab1e17f135c06a83f990e93b6e8d2b2e981339a94bcfe9b12c34c77b75524" exitCode=0
Apr 24 16:42:29.807356 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:29.806988 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerDied","Data":"79eab1e17f135c06a83f990e93b6e8d2b2e981339a94bcfe9b12c34c77b75524"}
Apr 24 16:42:29.808998 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:29.808972 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lq8jc" event={"ID":"f15e46f0-058b-4e83-8863-31e18b978144","Type":"ContainerStarted","Data":"ac48b3a2917c7f3b2653f494c2236df77a7be926ae830060dbe86ae6b4f6ecfa"}
Apr 24 16:42:29.809070 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:29.809008 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lq8jc" event={"ID":"f15e46f0-058b-4e83-8863-31e18b978144","Type":"ContainerStarted","Data":"51eb87ed11c20db231250159e8cdceb66c1eed90219657f13e318e5fb4d7831f"}
Apr 24 16:42:29.853034 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:29.852981 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lq8jc" podStartSLOduration=3.088209793 podStartE2EDuration="3.852965696s" podCreationTimestamp="2026-04-24 16:42:26 +0000 UTC" firstStartedPulling="2026-04-24 16:42:26.90110872 +0000 UTC m=+199.358422569" lastFinishedPulling="2026-04-24 16:42:27.665864611 +0000 UTC m=+200.123178472" observedRunningTime="2026-04-24 16:42:29.851793806 +0000 UTC m=+202.309107677" watchObservedRunningTime="2026-04-24 16:42:29.852965696 +0000 UTC m=+202.310279567"
Apr 24 16:42:30.839127 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.839092 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-86bb488774-stnw5"]
Apr 24 16:42:30.842975 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.842904 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:30.845499 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.845473 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dgcen7hkhniv9\""
Apr 24 16:42:30.846556 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.846533 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 16:42:30.846735 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.846711 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 16:42:30.846813 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.846781 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 16:42:30.846875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.846783 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-v4zgh\""
Apr 24 16:42:30.846875 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.846854 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 16:42:30.851333 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:30.851249 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-86bb488774-stnw5"]
Apr 24 16:42:31.006567 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.006531 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/83beecc5-9867-43c6-8b53-c7cd037c42b1-audit-log\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.006719 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.006591 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-client-ca-bundle\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.006719 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.006640 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kbfl\" (UniqueName: \"kubernetes.io/projected/83beecc5-9867-43c6-8b53-c7cd037c42b1-kube-api-access-6kbfl\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.006719 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.006692 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-secret-metrics-server-tls\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.006820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.006796 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83beecc5-9867-43c6-8b53-c7cd037c42b1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.006858 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.006823 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/83beecc5-9867-43c6-8b53-c7cd037c42b1-metrics-server-audit-profiles\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.006890 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.006866 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-secret-metrics-server-client-certs\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.107898 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.107868 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/83beecc5-9867-43c6-8b53-c7cd037c42b1-audit-log\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108031 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.107990 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-client-ca-bundle\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108031 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.108020 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kbfl\" (UniqueName: \"kubernetes.io/projected/83beecc5-9867-43c6-8b53-c7cd037c42b1-kube-api-access-6kbfl\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108152 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.108052 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-secret-metrics-server-tls\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108152 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.108139 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83beecc5-9867-43c6-8b53-c7cd037c42b1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108254 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.108170 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/83beecc5-9867-43c6-8b53-c7cd037c42b1-metrics-server-audit-profiles\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108254 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.108219 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-secret-metrics-server-client-certs\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108367 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.108294 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/83beecc5-9867-43c6-8b53-c7cd037c42b1-audit-log\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.108841 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.108795 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83beecc5-9867-43c6-8b53-c7cd037c42b1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.109286 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.109262 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/83beecc5-9867-43c6-8b53-c7cd037c42b1-metrics-server-audit-profiles\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.111162 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.111137 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-secret-metrics-server-client-certs\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.111254 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.111190 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-client-ca-bundle\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.111293 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.111263 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/83beecc5-9867-43c6-8b53-c7cd037c42b1-secret-metrics-server-tls\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.116400 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.116377 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kbfl\" (UniqueName: \"kubernetes.io/projected/83beecc5-9867-43c6-8b53-c7cd037c42b1-kube-api-access-6kbfl\") pod \"metrics-server-86bb488774-stnw5\" (UID: \"83beecc5-9867-43c6-8b53-c7cd037c42b1\") " pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.155829 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.155550 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-86bb488774-stnw5"
Apr 24 16:42:31.297782 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.297712 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-86bb488774-stnw5"]
Apr 24 16:42:31.300907 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:31.300829 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83beecc5_9867_43c6_8b53_c7cd037c42b1.slice/crio-9b5dc8905d2450d4de233b43731297f8a8def86a9cc28bc6d611e03fa689e67a WatchSource:0}: Error finding container 9b5dc8905d2450d4de233b43731297f8a8def86a9cc28bc6d611e03fa689e67a: Status 404 returned error can't find the container with id 9b5dc8905d2450d4de233b43731297f8a8def86a9cc28bc6d611e03fa689e67a
Apr 24 16:42:31.317538 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.317227 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f6b6b6c98-mzqb5"]
Apr 24 16:42:31.349190 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.349163 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"]
Apr 24 16:42:31.352716 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.352697 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"
Apr 24 16:42:31.356882 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.356849 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-krz97\""
Apr 24 16:42:31.357162 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.357057 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 24 16:42:31.377773 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.377745 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"]
Apr 24 16:42:31.512278 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.512231 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4670a4ea-1ec1-494f-9df8-e887515f6638-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xrwb9\" (UID: \"4670a4ea-1ec1-494f-9df8-e887515f6638\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"
Apr 24 16:42:31.613430 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.613334 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4670a4ea-1ec1-494f-9df8-e887515f6638-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xrwb9\" (UID: \"4670a4ea-1ec1-494f-9df8-e887515f6638\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"
Apr 24 16:42:31.615754 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.615736 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4670a4ea-1ec1-494f-9df8-e887515f6638-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xrwb9\" (UID: \"4670a4ea-1ec1-494f-9df8-e887515f6638\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"
Apr 24 16:42:31.665758 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.665720 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"
Apr 24 16:42:31.786216 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.786184 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9"]
Apr 24 16:42:31.789128 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:31.789091 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4670a4ea_1ec1_494f_9df8_e887515f6638.slice/crio-a40eee5188804f1e52f3321a7646622cdd5ece4b467ac0d18059597ff35b2a02 WatchSource:0}: Error finding container a40eee5188804f1e52f3321a7646622cdd5ece4b467ac0d18059597ff35b2a02: Status 404 returned error can't find the container with id a40eee5188804f1e52f3321a7646622cdd5ece4b467ac0d18059597ff35b2a02
Apr 24 16:42:31.817186 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.817153 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9" event={"ID":"4670a4ea-1ec1-494f-9df8-e887515f6638","Type":"ContainerStarted","Data":"a40eee5188804f1e52f3321a7646622cdd5ece4b467ac0d18059597ff35b2a02"}
Apr 24 16:42:31.818399 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.818369 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-86bb488774-stnw5" event={"ID":"83beecc5-9867-43c6-8b53-c7cd037c42b1","Type":"ContainerStarted","Data":"9b5dc8905d2450d4de233b43731297f8a8def86a9cc28bc6d611e03fa689e67a"}
Apr 24 16:42:31.821393 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.821371 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerStarted","Data":"60297b166e9545f2319484c3085efa5af7ba6935a3f2d9e9bae6f18b20d95757"} Apr 24 16:42:31.821495 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.821397 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerStarted","Data":"bb64a34b9a3b2f27135a2b670b98fbf8128ef974260541a2b6eed497e9515af5"} Apr 24 16:42:31.821495 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.821406 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerStarted","Data":"9fcdc28998d4d7ffd24cb009f7a5f9b48022b80532a187f8d3fab87af214d3d1"} Apr 24 16:42:31.821495 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.821414 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerStarted","Data":"390b896ef4f072d4e5fa4846915a482727a50a9d346673f27f3abc82b3b87afe"} Apr 24 16:42:31.821495 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:31.821422 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerStarted","Data":"26b5d304caef3cdff1be5b5b515143c4581497085f123898951cac2b15491802"} Apr 24 16:42:32.801947 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.799962 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:42:32.807316 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.807289 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.810728 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.810705 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 16:42:32.811187 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.811167 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 16:42:32.811488 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.811258 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 16:42:32.811488 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.811260 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 16:42:32.811820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.811786 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 16:42:32.812313 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.812278 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 16:42:32.812313 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.812298 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 16:42:32.812498 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.812281 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 16:42:32.812498 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.812282 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bj98dnc8o5bup\"" Apr 24 16:42:32.812648 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.812577 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 16:42:32.812813 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.812796 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n6j58\"" Apr 24 16:42:32.813121 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.813100 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 16:42:32.813756 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.813738 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 16:42:32.816433 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.816386 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 16:42:32.824555 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824534 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-config\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824664 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824561 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824664 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824606 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824664 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824633 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824690 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824716 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-web-config\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824746 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824782 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.824820 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824817 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824871 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 
ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824891 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824907 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.824961 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.825040 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.825082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-config-out\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.825153 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.825232 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.825180 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zftq\" (UniqueName: \"kubernetes.io/projected/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-kube-api-access-2zftq\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.828335 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.828307 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:42:32.829209 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.829180 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7427106d-6c5c-457d-a96f-8d79db7264aa","Type":"ContainerStarted","Data":"cbe8ac2ef2fd642f8458c1a9648c6e5d5968ca83c58d673ba7a8e546184b74d2"} Apr 24 16:42:32.885275 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.885211 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.383910787 podStartE2EDuration="5.885190275s" podCreationTimestamp="2026-04-24 16:42:27 +0000 UTC" firstStartedPulling="2026-04-24 16:42:28.681175073 +0000 UTC m=+201.138488922" 
lastFinishedPulling="2026-04-24 16:42:32.182454548 +0000 UTC m=+204.639768410" observedRunningTime="2026-04-24 16:42:32.883073759 +0000 UTC m=+205.340387654" watchObservedRunningTime="2026-04-24 16:42:32.885190275 +0000 UTC m=+205.342504189" Apr 24 16:42:32.925922 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.925883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926124 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.925925 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-config-out\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926124 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926106 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926233 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zftq\" (UniqueName: \"kubernetes.io/projected/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-kube-api-access-2zftq\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926287 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926272 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-config\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926340 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926297 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926404 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926351 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926404 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926377 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926503 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926443 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926503 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:42:32.926472 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926598 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926504 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-web-config\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926598 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926565 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926699 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926609 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926699 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926636 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 24 16:42:32.926699 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926678 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926699 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926695 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926893 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926722 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926893 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926750 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.926893 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.926789 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.927648 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.927328 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.928407 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.928111 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.930103 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.929388 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.930103 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.929463 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.930103 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.929961 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.930313 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.930178 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.930884 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.930560 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-config-out\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.930884 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.930732 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.931379 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.931355 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.933422 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.933377 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.933422 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.933413 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-web-config\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.933612 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.933590 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.934039 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.934014 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.934458 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.934416 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.935761 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.935741 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.938298 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.938278 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-config\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:32.938399 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:32.938331 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zftq\" (UniqueName: \"kubernetes.io/projected/5a2a3f3b-e44b-4700-b5ff-26ebc37982d3-kube-api-access-2zftq\") pod \"prometheus-k8s-0\" (UID: \"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:33.119720 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.119691 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:33.258466 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.258439 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:42:33.260648 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:42:33.260619 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2a3f3b_e44b_4700_b5ff_26ebc37982d3.slice/crio-4d4a629c59021f6223a4e31281efe23ea3881804a224aa281c0337cce204d385 WatchSource:0}: Error finding container 4d4a629c59021f6223a4e31281efe23ea3881804a224aa281c0337cce204d385: Status 404 returned error can't find the container with id 4d4a629c59021f6223a4e31281efe23ea3881804a224aa281c0337cce204d385 Apr 24 16:42:33.833954 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.833892 2562 generic.go:358] "Generic (PLEG): container finished" podID="5a2a3f3b-e44b-4700-b5ff-26ebc37982d3" containerID="8fbcbbf285b32cf6082ccddd42d7b0cfb465d0fd83e165bdbdf10eee9296853e" exitCode=0 Apr 24 16:42:33.834415 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.833979 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerDied","Data":"8fbcbbf285b32cf6082ccddd42d7b0cfb465d0fd83e165bdbdf10eee9296853e"} Apr 24 16:42:33.834415 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.834009 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerStarted","Data":"4d4a629c59021f6223a4e31281efe23ea3881804a224aa281c0337cce204d385"} Apr 24 16:42:33.835432 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.835397 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-86bb488774-stnw5" 
event={"ID":"83beecc5-9867-43c6-8b53-c7cd037c42b1","Type":"ContainerStarted","Data":"f3cf643621f879c96bf0b72dd134c92ee9690aae526e7f1cf6483d560151aff7"} Apr 24 16:42:33.836869 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.836842 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9" event={"ID":"4670a4ea-1ec1-494f-9df8-e887515f6638","Type":"ContainerStarted","Data":"b5197f786e3cc10944644346806e1d28afbb266106d374c58bb6e75cb5b6ffb6"} Apr 24 16:42:33.837311 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.837296 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9" Apr 24 16:42:33.841672 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.841645 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9" Apr 24 16:42:33.875026 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.874978 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xrwb9" podStartSLOduration=1.579342536 podStartE2EDuration="2.87496423s" podCreationTimestamp="2026-04-24 16:42:31 +0000 UTC" firstStartedPulling="2026-04-24 16:42:31.79134034 +0000 UTC m=+204.248654188" lastFinishedPulling="2026-04-24 16:42:33.086962032 +0000 UTC m=+205.544275882" observedRunningTime="2026-04-24 16:42:33.874521165 +0000 UTC m=+206.331835038" watchObservedRunningTime="2026-04-24 16:42:33.87496423 +0000 UTC m=+206.332278102" Apr 24 16:42:33.895157 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:33.895110 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-86bb488774-stnw5" podStartSLOduration=2.167947537 podStartE2EDuration="3.895095697s" podCreationTimestamp="2026-04-24 16:42:30 +0000 UTC" firstStartedPulling="2026-04-24 16:42:31.303670982 +0000 UTC 
m=+203.760984833" lastFinishedPulling="2026-04-24 16:42:33.030819144 +0000 UTC m=+205.488132993" observedRunningTime="2026-04-24 16:42:33.894790021 +0000 UTC m=+206.352103893" watchObservedRunningTime="2026-04-24 16:42:33.895095697 +0000 UTC m=+206.352409564" Apr 24 16:42:36.850749 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:36.850665 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerStarted","Data":"86379bfb85d3451d3fac48003adf5d3f5c17245a10f0c4ef284c75caa5bbd9cd"} Apr 24 16:42:36.850749 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:36.850707 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerStarted","Data":"5b238e8324e1bcde63d5b9a641cc110cd5fdcf92b98b61903e1873dde4795371"} Apr 24 16:42:38.859582 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:38.859545 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerStarted","Data":"2020f5cba1f1d8773fdfd7b3592a0fc39ace9128ca4327e4ef82b70aa84b095d"} Apr 24 16:42:38.859582 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:38.859584 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerStarted","Data":"2df0569792607f716bf9e9ec5ae443e515ee4a421804168dd669dd907d198989"} Apr 24 16:42:38.860001 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:38.859594 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerStarted","Data":"b97d778a7fadb5ae1a219adcc2f2024b16bb18946bdf802efbdacf01f5c7aef7"} Apr 24 16:42:38.860001 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:42:38.859604 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5a2a3f3b-e44b-4700-b5ff-26ebc37982d3","Type":"ContainerStarted","Data":"d6ce910cb9999108a99a6badc91db552d434cff2b89f0265e33afed392094d2c"} Apr 24 16:42:38.894040 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:38.893988 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.490602505 podStartE2EDuration="6.893972146s" podCreationTimestamp="2026-04-24 16:42:32 +0000 UTC" firstStartedPulling="2026-04-24 16:42:33.835307556 +0000 UTC m=+206.292621405" lastFinishedPulling="2026-04-24 16:42:38.238677183 +0000 UTC m=+210.695991046" observedRunningTime="2026-04-24 16:42:38.892151481 +0000 UTC m=+211.349465371" watchObservedRunningTime="2026-04-24 16:42:38.893972146 +0000 UTC m=+211.351286017" Apr 24 16:42:43.120369 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:43.120328 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:51.156373 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:51.156328 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-86bb488774-stnw5" Apr 24 16:42:51.156373 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:51.156375 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-86bb488774-stnw5" Apr 24 16:42:56.343013 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.342946 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f6b6b6c98-mzqb5" podUID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" containerName="console" containerID="cri-o://b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846" gracePeriod=15 Apr 24 16:42:56.578064 ip-10-0-143-144 kubenswrapper[2562]: I0424 
16:42:56.578042 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f6b6b6c98-mzqb5_ba76f817-dd2f-4bdd-ba73-a83baa5abee0/console/0.log" Apr 24 16:42:56.578182 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.578118 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:56.677303 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677206 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-oauth-serving-cert\") pod \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " Apr 24 16:42:56.677303 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677257 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-config\") pod \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " Apr 24 16:42:56.677544 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677318 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-oauth-config\") pod \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " Apr 24 16:42:56.677544 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677347 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-serving-cert\") pod \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " Apr 24 16:42:56.677544 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677373 2562 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-service-ca\") pod \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " Apr 24 16:42:56.677544 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677407 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llsf8\" (UniqueName: \"kubernetes.io/projected/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-kube-api-access-llsf8\") pod \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\" (UID: \"ba76f817-dd2f-4bdd-ba73-a83baa5abee0\") " Apr 24 16:42:56.677739 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677694 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-config" (OuterVolumeSpecName: "console-config") pod "ba76f817-dd2f-4bdd-ba73-a83baa5abee0" (UID: "ba76f817-dd2f-4bdd-ba73-a83baa5abee0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:42:56.677788 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677737 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ba76f817-dd2f-4bdd-ba73-a83baa5abee0" (UID: "ba76f817-dd2f-4bdd-ba73-a83baa5abee0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:42:56.677830 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677791 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-service-ca" (OuterVolumeSpecName: "service-ca") pod "ba76f817-dd2f-4bdd-ba73-a83baa5abee0" (UID: "ba76f817-dd2f-4bdd-ba73-a83baa5abee0"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:42:56.677901 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677885 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-service-ca\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:42:56.677978 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677911 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-oauth-serving-cert\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:42:56.677978 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.677927 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-config\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:42:56.679724 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.679698 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-kube-api-access-llsf8" (OuterVolumeSpecName: "kube-api-access-llsf8") pod "ba76f817-dd2f-4bdd-ba73-a83baa5abee0" (UID: "ba76f817-dd2f-4bdd-ba73-a83baa5abee0"). InnerVolumeSpecName "kube-api-access-llsf8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:42:56.679807 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.679724 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ba76f817-dd2f-4bdd-ba73-a83baa5abee0" (UID: "ba76f817-dd2f-4bdd-ba73-a83baa5abee0"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:42:56.679807 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.679776 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ba76f817-dd2f-4bdd-ba73-a83baa5abee0" (UID: "ba76f817-dd2f-4bdd-ba73-a83baa5abee0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:42:56.779179 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.779144 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-oauth-config\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:42:56.779179 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.779173 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-console-serving-cert\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:42:56.779179 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.779184 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llsf8\" (UniqueName: \"kubernetes.io/projected/ba76f817-dd2f-4bdd-ba73-a83baa5abee0-kube-api-access-llsf8\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:42:56.914262 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.914236 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f6b6b6c98-mzqb5_ba76f817-dd2f-4bdd-ba73-a83baa5abee0/console/0.log" Apr 24 16:42:56.914447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.914273 2562 generic.go:358] "Generic (PLEG): container finished" podID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" containerID="b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846" 
exitCode=2 Apr 24 16:42:56.914447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.914306 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6b6b6c98-mzqb5" event={"ID":"ba76f817-dd2f-4bdd-ba73-a83baa5abee0","Type":"ContainerDied","Data":"b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846"} Apr 24 16:42:56.914447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.914339 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f6b6b6c98-mzqb5" Apr 24 16:42:56.914447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.914352 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6b6b6c98-mzqb5" event={"ID":"ba76f817-dd2f-4bdd-ba73-a83baa5abee0","Type":"ContainerDied","Data":"d993a3e3b72c3b21a1c99c2323fe643d43939e7f5f975709b03c8d92b62f1e4b"} Apr 24 16:42:56.914447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.914371 2562 scope.go:117] "RemoveContainer" containerID="b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846" Apr 24 16:42:56.922743 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.922723 2562 scope.go:117] "RemoveContainer" containerID="b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846" Apr 24 16:42:56.923056 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:42:56.923036 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846\": container with ID starting with b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846 not found: ID does not exist" containerID="b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846" Apr 24 16:42:56.923109 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.923064 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846"} err="failed to get container status \"b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846\": rpc error: code = NotFound desc = could not find container \"b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846\": container with ID starting with b8aad618ffc46f5b611ad8d128ff9ca6f43567a7f2f9a45aaf94b7e5d4e63846 not found: ID does not exist" Apr 24 16:42:56.936635 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.936576 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f6b6b6c98-mzqb5"] Apr 24 16:42:56.938639 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:56.938621 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f6b6b6c98-mzqb5"] Apr 24 16:42:58.177729 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:42:58.177695 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" path="/var/lib/kubelet/pods/ba76f817-dd2f-4bdd-ba73-a83baa5abee0/volumes" Apr 24 16:43:01.932550 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:01.932511 2562 generic.go:358] "Generic (PLEG): container finished" podID="e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9" containerID="40bc2f02878c2b5ef91a42fb883073cfe53856ac285a66227ac8a2e284d3ede8" exitCode=0 Apr 24 16:43:01.933072 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:01.932588 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" event={"ID":"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9","Type":"ContainerDied","Data":"40bc2f02878c2b5ef91a42fb883073cfe53856ac285a66227ac8a2e284d3ede8"} Apr 24 16:43:01.933072 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:01.933056 2562 scope.go:117] "RemoveContainer" containerID="40bc2f02878c2b5ef91a42fb883073cfe53856ac285a66227ac8a2e284d3ede8" Apr 24 16:43:01.933912 
ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:01.933890 2562 generic.go:358] "Generic (PLEG): container finished" podID="bc664fe7-8e82-414a-88b1-9faec08dd651" containerID="b4605385de9cf7f65fafeb6904ba99e7a16db1c6718bdf78e25d29a5906dd16a" exitCode=0 Apr 24 16:43:01.934026 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:01.933958 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jltdt" event={"ID":"bc664fe7-8e82-414a-88b1-9faec08dd651","Type":"ContainerDied","Data":"b4605385de9cf7f65fafeb6904ba99e7a16db1c6718bdf78e25d29a5906dd16a"} Apr 24 16:43:01.934276 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:01.934262 2562 scope.go:117] "RemoveContainer" containerID="b4605385de9cf7f65fafeb6904ba99e7a16db1c6718bdf78e25d29a5906dd16a" Apr 24 16:43:02.939212 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:02.939178 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6kdn6" event={"ID":"e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9","Type":"ContainerStarted","Data":"7301c29aea37c3a28e18bf1dcc326568ebd403b713905270a75a2cdd0ce9a7cf"} Apr 24 16:43:02.940824 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:02.940802 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jltdt" event={"ID":"bc664fe7-8e82-414a-88b1-9faec08dd651","Type":"ContainerStarted","Data":"3ae6a059fa7d67c7ef80df2e2726a12d440b7c1418a74cc2641ef509962e5e89"} Apr 24 16:43:04.790049 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:04.790022 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/init-config-reloader/0.log" Apr 24 16:43:04.989621 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:04.989578 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/alertmanager/0.log" Apr 24 16:43:05.189979 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:05.189880 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/config-reloader/0.log" Apr 24 16:43:05.392578 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:05.392544 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/kube-rbac-proxy-web/0.log" Apr 24 16:43:05.589246 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:05.589216 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/kube-rbac-proxy/0.log" Apr 24 16:43:05.789293 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:05.789258 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/kube-rbac-proxy-metric/0.log" Apr 24 16:43:05.989684 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:05.989657 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/prom-label-proxy/0.log" Apr 24 16:43:06.191150 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.191117 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-n5n4w_cc0177b3-ada2-4478-8a4f-354de52d1414/cluster-monitoring-operator/0.log" Apr 24 16:43:06.389881 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.389801 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-s6tzg_c67686de-860d-4144-a49a-f0d703568add/kube-state-metrics/0.log" Apr 24 16:43:06.590004 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.589975 
2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-s6tzg_c67686de-860d-4144-a49a-f0d703568add/kube-rbac-proxy-main/0.log" Apr 24 16:43:06.790895 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.790865 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-s6tzg_c67686de-860d-4144-a49a-f0d703568add/kube-rbac-proxy-self/0.log" Apr 24 16:43:06.954200 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.954165 2562 generic.go:358] "Generic (PLEG): container finished" podID="d0708a89-9e11-49df-97d8-8cbf72ad25dd" containerID="ade35193d9d605c2d96856b854db8da47c59b66bad823e9f133417b39307962a" exitCode=0 Apr 24 16:43:06.954399 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.954239 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf" event={"ID":"d0708a89-9e11-49df-97d8-8cbf72ad25dd","Type":"ContainerDied","Data":"ade35193d9d605c2d96856b854db8da47c59b66bad823e9f133417b39307962a"} Apr 24 16:43:06.954645 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.954628 2562 scope.go:117] "RemoveContainer" containerID="ade35193d9d605c2d96856b854db8da47c59b66bad823e9f133417b39307962a" Apr 24 16:43:06.989306 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:06.989280 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-86bb488774-stnw5_83beecc5-9867-43c6-8b53-c7cd037c42b1/metrics-server/0.log" Apr 24 16:43:07.195628 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:07.195595 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-xrwb9_4670a4ea-1ec1-494f-9df8-e887515f6638/monitoring-plugin/0.log" Apr 24 16:43:07.958465 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:07.958424 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nwdmf" event={"ID":"d0708a89-9e11-49df-97d8-8cbf72ad25dd","Type":"ContainerStarted","Data":"850dba4f02f6f5a45492e80dfbbc0f6fc538277ab445225a0f3800dc65abd503"} Apr 24 16:43:07.991904 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:07.991878 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lq8jc_f15e46f0-058b-4e83-8863-31e18b978144/init-textfile/0.log" Apr 24 16:43:08.190205 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:08.190178 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lq8jc_f15e46f0-058b-4e83-8863-31e18b978144/node-exporter/0.log" Apr 24 16:43:08.389715 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:08.389681 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lq8jc_f15e46f0-058b-4e83-8863-31e18b978144/kube-rbac-proxy/0.log" Apr 24 16:43:09.191161 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:09.191126 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/init-config-reloader/0.log" Apr 24 16:43:09.392300 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:09.392266 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/prometheus/0.log" Apr 24 16:43:09.590186 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:09.590152 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/config-reloader/0.log" Apr 24 16:43:09.789755 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:09.789727 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/thanos-sidecar/0.log" Apr 24 16:43:09.989368 ip-10-0-143-144 kubenswrapper[2562]: 
I0424 16:43:09.989337 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/kube-rbac-proxy-web/0.log" Apr 24 16:43:10.189748 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:10.189717 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/kube-rbac-proxy/0.log" Apr 24 16:43:10.391043 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:10.390955 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/kube-rbac-proxy-thanos/0.log" Apr 24 16:43:10.989712 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:10.989672 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hx5th_da0109d3-7192-4ad8-a35a-b4c4e4438cbe/prometheus-operator-admission-webhook/0.log" Apr 24 16:43:11.161827 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:11.161800 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-86bb488774-stnw5" Apr 24 16:43:11.165719 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:11.165691 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-86bb488774-stnw5" Apr 24 16:43:12.589754 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:12.589721 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:43:12.792330 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:12.792277 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/2.log" Apr 24 16:43:14.591443 ip-10-0-143-144 
kubenswrapper[2562]: I0424 16:43:14.591388 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m9vmw_9a9da436-211c-45bd-9f7b-51e5eea9f69e/dns-node-resolver/0.log"
Apr 24 16:43:15.389438 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:15.389401 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w297f_e5357c94-3bc4-4825-808b-68599eb79e96/node-ca/0.log"
Apr 24 16:43:15.590238 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:15.590205 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d8c574-57dx2_04503171-24fb-471c-9dcc-be6c1d1b3331/router/0.log"
Apr 24 16:43:19.899760 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:19.899723 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:43:19.901991 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:19.901962 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93f47728-cffc-4da9-9791-92c3d70ac2d2-metrics-certs\") pod \"network-metrics-daemon-wp5zn\" (UID: \"93f47728-cffc-4da9-9791-92c3d70ac2d2\") " pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:43:20.077270 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:20.077241 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xv2x8\""
Apr 24 16:43:20.085368 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:20.085325 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wp5zn"
Apr 24 16:43:20.251571 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:20.251542 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wp5zn"]
Apr 24 16:43:20.254566 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:43:20.254510 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f47728_cffc_4da9_9791_92c3d70ac2d2.slice/crio-8c8fea4e4b12131fa922595730abb973d705a4454a55c1f30a738bf7409253bd WatchSource:0}: Error finding container 8c8fea4e4b12131fa922595730abb973d705a4454a55c1f30a738bf7409253bd: Status 404 returned error can't find the container with id 8c8fea4e4b12131fa922595730abb973d705a4454a55c1f30a738bf7409253bd
Apr 24 16:43:21.001198 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:21.001160 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wp5zn" event={"ID":"93f47728-cffc-4da9-9791-92c3d70ac2d2","Type":"ContainerStarted","Data":"8c8fea4e4b12131fa922595730abb973d705a4454a55c1f30a738bf7409253bd"}
Apr 24 16:43:22.006419 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:22.006383 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wp5zn" event={"ID":"93f47728-cffc-4da9-9791-92c3d70ac2d2","Type":"ContainerStarted","Data":"e8a6e75fc391fde74f9e0f96b0e57238c288211e23169ab14238c6ba337432f2"}
Apr 24 16:43:22.006419 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:22.006416 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wp5zn" event={"ID":"93f47728-cffc-4da9-9791-92c3d70ac2d2","Type":"ContainerStarted","Data":"1737d0760018e3678daa322308be49448ba62aec521ff7b8955ca47a675191b6"}
Apr 24 16:43:22.023194 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:22.023141 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wp5zn" podStartSLOduration=253.027395142 podStartE2EDuration="4m14.023122638s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:43:20.256618984 +0000 UTC m=+252.713932850" lastFinishedPulling="2026-04-24 16:43:21.252346483 +0000 UTC m=+253.709660346" observedRunningTime="2026-04-24 16:43:22.022460587 +0000 UTC m=+254.479774459" watchObservedRunningTime="2026-04-24 16:43:22.023122638 +0000 UTC m=+254.480436511"
Apr 24 16:43:33.120869 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:33.120816 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:33.136706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:33.136679 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:34.062673 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:34.062645 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:47.650268 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:43:47.650218 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zgxq7" podUID="444f46c8-3c9a-4e72-8000-ca142ae511ef"
Apr 24 16:43:47.650268 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:43:47.650221 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" podUID="e8f243a6-3111-4778-9ca9-092db5973836"
Apr 24 16:43:47.650678 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:43:47.650217 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dj6nz" podUID="bf7a70b6-9c3d-42f2-912b-ae46ee6adb21"
Apr 24 16:43:48.094337 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:48.094305 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:43:48.094509 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:48.094347 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:43:48.094509 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:48.094378 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:43:50.984825 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.984776 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-f75fb554d-q7kkf"]
Apr 24 16:43:50.985587 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.985562 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" containerName="console"
Apr 24 16:43:50.985756 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.985742 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" containerName="console"
Apr 24 16:43:50.986099 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.986083 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba76f817-dd2f-4bdd-ba73-a83baa5abee0" containerName="console"
Apr 24 16:43:50.993342 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.993318 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:50.996215 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.996085 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 16:43:50.996215 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.996172 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 16:43:50.996765 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.996745 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-sfmkt\""
Apr 24 16:43:50.997044 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.997024 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 16:43:50.997561 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.997533 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 16:43:50.997674 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.997454 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 16:43:50.998268 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:50.998239 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f75fb554d-q7kkf"]
Apr 24 16:43:51.007533 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.007489 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:43:51.007728 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.007541 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:43:51.007728 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.007600 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:43:51.009089 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.009039 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 16:43:51.011525 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.011505 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"image-registry-56d5dcfb77-vmzmz\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:43:51.011683 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.011582 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf7a70b6-9c3d-42f2-912b-ae46ee6adb21-metrics-tls\") pod \"dns-default-dj6nz\" (UID: \"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21\") " pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:43:51.028453 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.027750 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/444f46c8-3c9a-4e72-8000-ca142ae511ef-cert\") pod \"ingress-canary-zgxq7\" (UID: \"444f46c8-3c9a-4e72-8000-ca142ae511ef\") " pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:43:51.097785 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.097756 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-l6l2l\""
Apr 24 16:43:51.097978 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.097850 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-szcnf\""
Apr 24 16:43:51.097978 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.097885 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6n2jd\""
Apr 24 16:43:51.105542 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.105523 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgxq7"
Apr 24 16:43:51.105664 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.105564 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:43:51.105664 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.105629 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:43:51.108560 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108529 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.108670 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108573 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-serving-certs-ca-bundle\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.108670 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108600 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.108670 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108647 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n774x\" (UniqueName: \"kubernetes.io/projected/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-kube-api-access-n774x\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.108841 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108725 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-telemeter-client-tls\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.108841 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108761 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-metrics-client-ca\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.108841 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108794 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-federate-client-tls\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.108962 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.108871 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-secret-telemeter-client\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.210511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.210333 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.210511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.210426 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-serving-certs-ca-bundle\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.210511 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.210453 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.211456 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.210767 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n774x\" (UniqueName: \"kubernetes.io/projected/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-kube-api-access-n774x\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.211456 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.211131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-telemeter-client-tls\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.211456 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.211168 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-metrics-client-ca\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.211456 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.211308 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-federate-client-tls\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.211456 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.211431 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-serving-certs-ca-bundle\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.212331 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.211636 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-secret-telemeter-client\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.212331 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.211656 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.212677 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.212658 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-metrics-client-ca\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.214510 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.214490 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-telemeter-client-tls\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.215045 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.215018 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.215130 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.215109 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-secret-telemeter-client\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.215377 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.215360 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-federate-client-tls\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.229359 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.229311 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n774x\" (UniqueName: \"kubernetes.io/projected/4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d-kube-api-access-n774x\") pod \"telemeter-client-f75fb554d-q7kkf\" (UID: \"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d\") " pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.267217 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.267040 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dj6nz"]
Apr 24 16:43:51.270798 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:43:51.270768 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf7a70b6_9c3d_42f2_912b_ae46ee6adb21.slice/crio-ed0efa3d379bb6bb03b5c5b0c8ece129f5da165157212e6f26dcb822d969511c WatchSource:0}: Error finding container ed0efa3d379bb6bb03b5c5b0c8ece129f5da165157212e6f26dcb822d969511c: Status 404 returned error can't find the container with id ed0efa3d379bb6bb03b5c5b0c8ece129f5da165157212e6f26dcb822d969511c
Apr 24 16:43:51.278459 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.278434 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgxq7"]
Apr 24 16:43:51.280589 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:43:51.280562 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod444f46c8_3c9a_4e72_8000_ca142ae511ef.slice/crio-f1e3e9c78622a1bff81bf9c8e0904d8027aad8a98c887a92bf1994a2f6dede23 WatchSource:0}: Error finding container f1e3e9c78622a1bff81bf9c8e0904d8027aad8a98c887a92bf1994a2f6dede23: Status 404 returned error can't find the container with id f1e3e9c78622a1bff81bf9c8e0904d8027aad8a98c887a92bf1994a2f6dede23
Apr 24 16:43:51.298254 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.298226 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56d5dcfb77-vmzmz"]
Apr 24 16:43:51.300556 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:43:51.300528 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f243a6_3111_4778_9ca9_092db5973836.slice/crio-5dfd8a9c45ec056171c1c6e94c79eeaa4cc4c06c5cf489e7940c13022097a3db WatchSource:0}: Error finding container 5dfd8a9c45ec056171c1c6e94c79eeaa4cc4c06c5cf489e7940c13022097a3db: Status 404 returned error can't find the container with id 5dfd8a9c45ec056171c1c6e94c79eeaa4cc4c06c5cf489e7940c13022097a3db
Apr 24 16:43:51.338404 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.338379 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf"
Apr 24 16:43:51.472077 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:51.472051 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-f75fb554d-q7kkf"]
Apr 24 16:43:51.474193 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:43:51.474162 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f4c333b_4338_4c1a_aeeb_3c2a3a5d125d.slice/crio-6c11633a17c5172f31d23be1c71a668fae4672b9751256e07f813bc3a0379beb WatchSource:0}: Error finding container 6c11633a17c5172f31d23be1c71a668fae4672b9751256e07f813bc3a0379beb: Status 404 returned error can't find the container with id 6c11633a17c5172f31d23be1c71a668fae4672b9751256e07f813bc3a0379beb
Apr 24 16:43:52.110299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:52.110247 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgxq7" event={"ID":"444f46c8-3c9a-4e72-8000-ca142ae511ef","Type":"ContainerStarted","Data":"f1e3e9c78622a1bff81bf9c8e0904d8027aad8a98c887a92bf1994a2f6dede23"}
Apr 24 16:43:52.112107 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:52.112072 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf" event={"ID":"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d","Type":"ContainerStarted","Data":"6c11633a17c5172f31d23be1c71a668fae4672b9751256e07f813bc3a0379beb"}
Apr 24 16:43:52.113344 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:52.113319 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dj6nz" event={"ID":"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21","Type":"ContainerStarted","Data":"ed0efa3d379bb6bb03b5c5b0c8ece129f5da165157212e6f26dcb822d969511c"}
Apr 24 16:43:52.115060 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:52.115016 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" event={"ID":"e8f243a6-3111-4778-9ca9-092db5973836","Type":"ContainerStarted","Data":"deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64"}
Apr 24 16:43:52.115060 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:52.115041 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" event={"ID":"e8f243a6-3111-4778-9ca9-092db5973836","Type":"ContainerStarted","Data":"5dfd8a9c45ec056171c1c6e94c79eeaa4cc4c06c5cf489e7940c13022097a3db"}
Apr 24 16:43:52.115246 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:52.115163 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:43:52.139168 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:52.139081 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" podStartSLOduration=264.139061642 podStartE2EDuration="4m24.139061642s" podCreationTimestamp="2026-04-24 16:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:43:52.135896196 +0000 UTC m=+284.593210069" watchObservedRunningTime="2026-04-24 16:43:52.139061642 +0000 UTC m=+284.596375514"
Apr 24 16:43:54.124462 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.124429 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf" event={"ID":"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d","Type":"ContainerStarted","Data":"308222da126ae43183298364be7d9f14bb5bac76c1ef374de33f8fce9feaa5f8"}
Apr 24 16:43:54.124959 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.124469 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf" event={"ID":"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d","Type":"ContainerStarted","Data":"73d09d19422250ce01c18ec04e0ee73e394969c613a0a13dadead98e7fc5dd88"}
Apr 24 16:43:54.124959 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.124484 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf" event={"ID":"4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d","Type":"ContainerStarted","Data":"91d07ee170b893ce7f5cadd79dd8257872be0395fc7251e48e9f2db263bd4c0b"}
Apr 24 16:43:54.126187 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.126159 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dj6nz" event={"ID":"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21","Type":"ContainerStarted","Data":"374d7cfe3cd3b07eb364b6d09cc7841872fe1c721bea6e37d151407eba1d18a6"}
Apr 24 16:43:54.126300 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.126193 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dj6nz" event={"ID":"bf7a70b6-9c3d-42f2-912b-ae46ee6adb21","Type":"ContainerStarted","Data":"6b0465d215504f62fc05ef7033330733e678e6a4255b946a7916c2922a368c35"}
Apr 24 16:43:54.126300 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.126212 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:43:54.127371 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.127351 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgxq7" event={"ID":"444f46c8-3c9a-4e72-8000-ca142ae511ef","Type":"ContainerStarted","Data":"30463678ba0cd2f4daf5061a9b19898bedc64cb17ec4d9882adac5ce5b119b46"}
Apr 24 16:43:54.151596 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.151545 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-f75fb554d-q7kkf" podStartSLOduration=1.983117892 podStartE2EDuration="4.151531797s" podCreationTimestamp="2026-04-24 16:43:50 +0000 UTC" firstStartedPulling="2026-04-24 16:43:51.476071612 +0000 UTC m=+283.933385464" lastFinishedPulling="2026-04-24 16:43:53.644485518 +0000 UTC m=+286.101799369" observedRunningTime="2026-04-24 16:43:54.150161416 +0000 UTC m=+286.607475288" watchObservedRunningTime="2026-04-24 16:43:54.151531797 +0000 UTC m=+286.608845667"
Apr 24 16:43:54.167591 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.167534 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dj6nz" podStartSLOduration=251.798165393 podStartE2EDuration="4m14.167518822s" podCreationTimestamp="2026-04-24 16:39:40 +0000 UTC" firstStartedPulling="2026-04-24 16:43:51.27317713 +0000 UTC m=+283.730490980" lastFinishedPulling="2026-04-24 16:43:53.642530547 +0000 UTC m=+286.099844409" observedRunningTime="2026-04-24 16:43:54.165757348 +0000 UTC m=+286.623071222" watchObservedRunningTime="2026-04-24 16:43:54.167518822 +0000 UTC m=+286.624832693"
Apr 24 16:43:54.183803 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:43:54.183747 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zgxq7" podStartSLOduration=251.818477261 podStartE2EDuration="4m14.183732117s" podCreationTimestamp="2026-04-24 16:39:40 +0000 UTC" firstStartedPulling="2026-04-24 16:43:51.282495754 +0000 UTC m=+283.739809606" lastFinishedPulling="2026-04-24 16:43:53.647750599 +0000 UTC m=+286.105064462" observedRunningTime="2026-04-24 16:43:54.181794616 +0000 UTC m=+286.639108489" watchObservedRunningTime="2026-04-24 16:43:54.183732117 +0000 UTC m=+286.641045988"
Apr 24 16:44:01.155867 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:01.155783 2562 generic.go:358] "Generic (PLEG): container finished" podID="5e812eb2-480c-4a84-b382-d7c26bd0da17" containerID="41883531f55cb97b3f74d173839eb59860d72c506e6d75d325f66212b6354e97" exitCode=255
Apr 24 16:44:01.156313 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:01.155858 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" event={"ID":"5e812eb2-480c-4a84-b382-d7c26bd0da17","Type":"ContainerDied","Data":"41883531f55cb97b3f74d173839eb59860d72c506e6d75d325f66212b6354e97"}
Apr 24 16:44:01.156313 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:01.155900 2562 scope.go:117] "RemoveContainer" containerID="75344998c22843bc3d0cd65de3c82e582673116e7c5fd3b97a1d05e1e4fdf0ea"
Apr 24 16:44:01.163291 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:01.163270 2562 scope.go:117] "RemoveContainer" containerID="41883531f55cb97b3f74d173839eb59860d72c506e6d75d325f66212b6354e97"
Apr 24 16:44:01.163481 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:44:01.163463 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-667fffb569-r7fw6_open-cluster-management-agent-addon(5e812eb2-480c-4a84-b382-d7c26bd0da17)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" podUID="5e812eb2-480c-4a84-b382-d7c26bd0da17"
Apr 24 16:44:04.132871 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:04.132844 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dj6nz"
Apr 24 16:44:08.057333 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:08.057296 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log"
Apr 24 16:44:08.057794 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:08.057694 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log"
Apr 24 16:44:08.068867 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:08.068844 2562 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 16:44:08.074447 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:08.074424 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6"
Apr 24 16:44:08.074804 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:08.074789 2562 scope.go:117] "RemoveContainer" containerID="41883531f55cb97b3f74d173839eb59860d72c506e6d75d325f66212b6354e97"
Apr 24 16:44:08.075043 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:44:08.075021 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-667fffb569-r7fw6_open-cluster-management-agent-addon(5e812eb2-480c-4a84-b382-d7c26bd0da17)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" podUID="5e812eb2-480c-4a84-b382-d7c26bd0da17"
Apr 24 16:44:11.110128 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:11.110093 2562 patch_prober.go:28] interesting pod/image-registry-56d5dcfb77-vmzmz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 16:44:11.112526 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:11.110173 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" podUID="e8f243a6-3111-4778-9ca9-092db5973836" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:44:13.122800 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:13.122764 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz"
Apr 24 16:44:21.179703 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:21.179674 2562 scope.go:117] "RemoveContainer" containerID="41883531f55cb97b3f74d173839eb59860d72c506e6d75d325f66212b6354e97"
Apr 24 16:44:21.180665 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:21.180648 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 16:44:22.233695 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:22.233660 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-667fffb569-r7fw6" event={"ID":"5e812eb2-480c-4a84-b382-d7c26bd0da17","Type":"ContainerStarted","Data":"4615fb70c02cfb834635f43e7612436798a5680871fe997c106b541382c3c1db"}
Apr 24 16:44:33.617043 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:33.617006 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56d5dcfb77-vmzmz"]
Apr 24 16:44:58.636606 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:58.636491 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" podUID="e8f243a6-3111-4778-9ca9-092db5973836" containerName="registry" containerID="cri-o://deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64" gracePeriod=30
Apr 24 16:44:59.887001 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:44:59.886969 2562 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:45:00.033047 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033016 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-installation-pull-secrets\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033101 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-image-registry-private-configuration\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033125 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6pcs\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-kube-api-access-n6pcs\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033155 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-bound-sa-token\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033176 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-trusted-ca\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: 
\"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033200 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8f243a6-3111-4778-9ca9-092db5973836-ca-trust-extracted\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033242 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033240 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-registry-certificates\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033566 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033286 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") pod \"e8f243a6-3111-4778-9ca9-092db5973836\" (UID: \"e8f243a6-3111-4778-9ca9-092db5973836\") " Apr 24 16:45:00.033735 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.033681 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:00.034054 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.034009 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:45:00.035527 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.035494 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:45:00.035625 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.035568 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:45:00.035731 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.035715 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-kube-api-access-n6pcs" (OuterVolumeSpecName: "kube-api-access-n6pcs") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "kube-api-access-n6pcs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:45:00.035856 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.035840 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:45:00.035912 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.035867 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:45:00.041706 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.041678 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f243a6-3111-4778-9ca9-092db5973836-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e8f243a6-3111-4778-9ca9-092db5973836" (UID: "e8f243a6-3111-4778-9ca9-092db5973836"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:45:00.134375 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134331 2562 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-registry-tls\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:45:00.134375 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134366 2562 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-installation-pull-secrets\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:45:00.134375 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134377 2562 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e8f243a6-3111-4778-9ca9-092db5973836-image-registry-private-configuration\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:45:00.134605 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134388 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n6pcs\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-kube-api-access-n6pcs\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:45:00.134605 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134397 2562 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e8f243a6-3111-4778-9ca9-092db5973836-bound-sa-token\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:45:00.134605 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134406 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-trusted-ca\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 
16:45:00.134605 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134414 2562 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e8f243a6-3111-4778-9ca9-092db5973836-ca-trust-extracted\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:45:00.134605 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.134425 2562 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e8f243a6-3111-4778-9ca9-092db5973836-registry-certificates\") on node \"ip-10-0-143-144.ec2.internal\" DevicePath \"\"" Apr 24 16:45:00.359698 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.359603 2562 generic.go:358] "Generic (PLEG): container finished" podID="e8f243a6-3111-4778-9ca9-092db5973836" containerID="deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64" exitCode=0 Apr 24 16:45:00.359698 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.359662 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" Apr 24 16:45:00.359698 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.359682 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" event={"ID":"e8f243a6-3111-4778-9ca9-092db5973836","Type":"ContainerDied","Data":"deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64"} Apr 24 16:45:00.359924 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.359713 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56d5dcfb77-vmzmz" event={"ID":"e8f243a6-3111-4778-9ca9-092db5973836","Type":"ContainerDied","Data":"5dfd8a9c45ec056171c1c6e94c79eeaa4cc4c06c5cf489e7940c13022097a3db"} Apr 24 16:45:00.359924 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.359730 2562 scope.go:117] "RemoveContainer" containerID="deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64" Apr 24 16:45:00.368055 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.368035 2562 scope.go:117] "RemoveContainer" containerID="deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64" Apr 24 16:45:00.368310 ip-10-0-143-144 kubenswrapper[2562]: E0424 16:45:00.368294 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64\": container with ID starting with deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64 not found: ID does not exist" containerID="deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64" Apr 24 16:45:00.368351 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.368317 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64"} err="failed to get container status 
\"deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64\": rpc error: code = NotFound desc = could not find container \"deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64\": container with ID starting with deaaad048d01d60f868c02e9a398d65cfa52cff4a477c3dc670cf648a5163a64 not found: ID does not exist" Apr 24 16:45:00.376719 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.376696 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56d5dcfb77-vmzmz"] Apr 24 16:45:00.379827 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:00.379796 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56d5dcfb77-vmzmz"] Apr 24 16:45:02.177491 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:02.177446 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f243a6-3111-4778-9ca9-092db5973836" path="/var/lib/kubelet/pods/e8f243a6-3111-4778-9ca9-092db5973836/volumes" Apr 24 16:45:30.607356 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.607316 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4qjtn"] Apr 24 16:45:30.607994 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.607843 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8f243a6-3111-4778-9ca9-092db5973836" containerName="registry" Apr 24 16:45:30.607994 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.607865 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f243a6-3111-4778-9ca9-092db5973836" containerName="registry" Apr 24 16:45:30.608135 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.607998 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8f243a6-3111-4778-9ca9-092db5973836" containerName="registry" Apr 24 16:45:30.611068 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.611045 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.614726 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.614706 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:45:30.623309 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.623285 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4qjtn"] Apr 24 16:45:30.694808 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.694774 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/256d1621-cfc7-4636-a8a5-ec32477c8abc-original-pull-secret\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.695027 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.694847 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/256d1621-cfc7-4636-a8a5-ec32477c8abc-dbus\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.695027 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.694915 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/256d1621-cfc7-4636-a8a5-ec32477c8abc-kubelet-config\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.796722 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.796682 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/256d1621-cfc7-4636-a8a5-ec32477c8abc-dbus\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.796722 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.796727 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/256d1621-cfc7-4636-a8a5-ec32477c8abc-kubelet-config\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.796922 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.796760 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/256d1621-cfc7-4636-a8a5-ec32477c8abc-original-pull-secret\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.796922 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.796881 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/256d1621-cfc7-4636-a8a5-ec32477c8abc-dbus\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.796922 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.796881 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/256d1621-cfc7-4636-a8a5-ec32477c8abc-kubelet-config\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.799007 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.798989 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/256d1621-cfc7-4636-a8a5-ec32477c8abc-original-pull-secret\") pod \"global-pull-secret-syncer-4qjtn\" (UID: \"256d1621-cfc7-4636-a8a5-ec32477c8abc\") " pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:30.920592 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:30.920497 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4qjtn" Apr 24 16:45:31.047716 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:31.047691 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4qjtn"] Apr 24 16:45:31.050262 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:45:31.050235 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod256d1621_cfc7_4636_a8a5_ec32477c8abc.slice/crio-caf5fd292c2b23d4bf459b5b4ac3f6db16814559dc4d60e69c2c6296569e06b4 WatchSource:0}: Error finding container caf5fd292c2b23d4bf459b5b4ac3f6db16814559dc4d60e69c2c6296569e06b4: Status 404 returned error can't find the container with id caf5fd292c2b23d4bf459b5b4ac3f6db16814559dc4d60e69c2c6296569e06b4 Apr 24 16:45:31.459860 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:31.459822 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4qjtn" event={"ID":"256d1621-cfc7-4636-a8a5-ec32477c8abc","Type":"ContainerStarted","Data":"caf5fd292c2b23d4bf459b5b4ac3f6db16814559dc4d60e69c2c6296569e06b4"} Apr 24 16:45:35.476465 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:35.476419 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4qjtn" event={"ID":"256d1621-cfc7-4636-a8a5-ec32477c8abc","Type":"ContainerStarted","Data":"489384f2a232a730411f7352647ebe1b86b3fe5abbf1a9dfd23b4c9a4b991284"} Apr 24 16:45:35.495947 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:45:35.495864 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4qjtn" podStartSLOduration=1.257727764 podStartE2EDuration="5.495844443s" podCreationTimestamp="2026-04-24 16:45:30 +0000 UTC" firstStartedPulling="2026-04-24 16:45:31.052164879 +0000 UTC m=+383.509478729" lastFinishedPulling="2026-04-24 16:45:35.290281553 +0000 UTC m=+387.747595408" observedRunningTime="2026-04-24 16:45:35.493786668 +0000 UTC m=+387.951100541" watchObservedRunningTime="2026-04-24 16:45:35.495844443 +0000 UTC m=+387.953158315" Apr 24 16:49:08.093107 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:49:08.093080 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:49:08.094128 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:49:08.094105 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:54:08.117772 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:08.117693 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:54:08.120817 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:08.120791 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log" Apr 24 16:54:31.403499 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:31.403463 2562 ???:1] "http2: server: error reading preface from client 10.0.129.204:36094: read tcp 10.0.143.144:10250->10.0.129.204:36094: read: connection reset by peer" Apr 24 16:54:31.407315 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:31.407292 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-4qjtn_256d1621-cfc7-4636-a8a5-ec32477c8abc/global-pull-secret-syncer/0.log" Apr 24 16:54:31.582017 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:31.581987 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-t5dcg_6db26d97-eaa5-4cf6-9b9c-a4d322db5952/konnectivity-agent/0.log" Apr 24 16:54:31.689887 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:31.689797 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-144.ec2.internal_0475c0131c709c68e4c47751862dac8e/haproxy/0.log" Apr 24 16:54:34.926734 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:34.926706 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/alertmanager/0.log" Apr 24 16:54:34.951063 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:34.951041 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/config-reloader/0.log" Apr 24 16:54:34.978794 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:34.978753 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/kube-rbac-proxy-web/0.log" Apr 24 16:54:34.999165 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:34.999140 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/kube-rbac-proxy/0.log" Apr 24 16:54:35.025674 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.025646 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/kube-rbac-proxy-metric/0.log" Apr 24 16:54:35.048790 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.048762 2562 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/prom-label-proxy/0.log" Apr 24 16:54:35.071624 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.071601 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7427106d-6c5c-457d-a96f-8d79db7264aa/init-config-reloader/0.log" Apr 24 16:54:35.106283 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.106255 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-n5n4w_cc0177b3-ada2-4478-8a4f-354de52d1414/cluster-monitoring-operator/0.log" Apr 24 16:54:35.129048 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.129023 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-s6tzg_c67686de-860d-4144-a49a-f0d703568add/kube-state-metrics/0.log" Apr 24 16:54:35.154551 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.154527 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-s6tzg_c67686de-860d-4144-a49a-f0d703568add/kube-rbac-proxy-main/0.log" Apr 24 16:54:35.175047 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.175018 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-s6tzg_c67686de-860d-4144-a49a-f0d703568add/kube-rbac-proxy-self/0.log" Apr 24 16:54:35.208145 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.208083 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-86bb488774-stnw5_83beecc5-9867-43c6-8b53-c7cd037c42b1/metrics-server/0.log" Apr 24 16:54:35.234251 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.234228 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-xrwb9_4670a4ea-1ec1-494f-9df8-e887515f6638/monitoring-plugin/0.log" Apr 24 
16:54:35.418160 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.418135 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lq8jc_f15e46f0-058b-4e83-8863-31e18b978144/node-exporter/0.log"
Apr 24 16:54:35.441258 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.441236 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lq8jc_f15e46f0-058b-4e83-8863-31e18b978144/kube-rbac-proxy/0.log"
Apr 24 16:54:35.462581 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.462507 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lq8jc_f15e46f0-058b-4e83-8863-31e18b978144/init-textfile/0.log"
Apr 24 16:54:35.572424 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.572395 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/prometheus/0.log"
Apr 24 16:54:35.596432 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.596402 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/config-reloader/0.log"
Apr 24 16:54:35.618576 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.618553 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/thanos-sidecar/0.log"
Apr 24 16:54:35.640849 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.640826 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/kube-rbac-proxy-web/0.log"
Apr 24 16:54:35.669655 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.669594 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/kube-rbac-proxy/0.log"
Apr 24 16:54:35.696636 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.696588 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/kube-rbac-proxy-thanos/0.log"
Apr 24 16:54:35.725774 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.725710 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5a2a3f3b-e44b-4700-b5ff-26ebc37982d3/init-config-reloader/0.log"
Apr 24 16:54:35.806046 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.805989 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hx5th_da0109d3-7192-4ad8-a35a-b4c4e4438cbe/prometheus-operator-admission-webhook/0.log"
Apr 24 16:54:35.837171 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.837134 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f75fb554d-q7kkf_4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d/telemeter-client/0.log"
Apr 24 16:54:35.861175 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.861153 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f75fb554d-q7kkf_4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d/reload/0.log"
Apr 24 16:54:35.884721 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:35.884700 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-f75fb554d-q7kkf_4f4c333b-4338-4c1a-aeeb-3c2a3a5d125d/kube-rbac-proxy/0.log"
Apr 24 16:54:37.586394 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:37.586368 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/1.log"
Apr 24 16:54:37.592117 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:37.592098 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-99ws4_7235e10c-761b-4d2d-a4f9-2d8114898c5d/console-operator/2.log"
Apr 24 16:54:38.418272 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.418241 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-stqch_3371be78-94e6-4a3b-98b2-9aaad783afa5/volume-data-source-validator/0.log"
Apr 24 16:54:38.592828 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.592792 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"]
Apr 24 16:54:38.596187 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.596170 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.599580 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.599558 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wz7wm\"/\"kube-root-ca.crt\""
Apr 24 16:54:38.599665 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.599608 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wz7wm\"/\"openshift-service-ca.crt\""
Apr 24 16:54:38.599665 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.599565 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wz7wm\"/\"default-dockercfg-wj4t8\""
Apr 24 16:54:38.605364 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.605341 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"]
Apr 24 16:54:38.643747 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.643711 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-proc\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.643915 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.643763 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pmqq\" (UniqueName: \"kubernetes.io/projected/4cae0c3c-6f57-412f-a400-62e074b8a7cd-kube-api-access-8pmqq\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.643915 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.643830 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-sys\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.643915 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.643847 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-podres\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.643915 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.643870 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-lib-modules\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745098 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745066 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pmqq\" (UniqueName: \"kubernetes.io/projected/4cae0c3c-6f57-412f-a400-62e074b8a7cd-kube-api-access-8pmqq\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745120 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-sys\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745140 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-podres\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745168 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-lib-modules\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745215 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-proc\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745262 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-sys\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745299 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745296 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-proc\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745509 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745325 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-podres\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.745509 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.745335 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4cae0c3c-6f57-412f-a400-62e074b8a7cd-lib-modules\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.752876 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.752846 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pmqq\" (UniqueName: \"kubernetes.io/projected/4cae0c3c-6f57-412f-a400-62e074b8a7cd-kube-api-access-8pmqq\") pod \"perf-node-gather-daemonset-l4xqb\" (UID: \"4cae0c3c-6f57-412f-a400-62e074b8a7cd\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:38.907147 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:38.907111 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:39.031632 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.031587 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"]
Apr 24 16:54:39.034101 ip-10-0-143-144 kubenswrapper[2562]: W0424 16:54:39.034071 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4cae0c3c_6f57_412f_a400_62e074b8a7cd.slice/crio-750cef3902e712f211ee690f0d2dc97b27cdf9563c9dedc7136bfeac04e1825c WatchSource:0}: Error finding container 750cef3902e712f211ee690f0d2dc97b27cdf9563c9dedc7136bfeac04e1825c: Status 404 returned error can't find the container with id 750cef3902e712f211ee690f0d2dc97b27cdf9563c9dedc7136bfeac04e1825c
Apr 24 16:54:39.035832 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.035812 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 16:54:39.184557 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.184529 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dj6nz_bf7a70b6-9c3d-42f2-912b-ae46ee6adb21/dns/0.log"
Apr 24 16:54:39.204572 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.204545 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dj6nz_bf7a70b6-9c3d-42f2-912b-ae46ee6adb21/kube-rbac-proxy/0.log"
Apr 24 16:54:39.272507 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.272480 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m9vmw_9a9da436-211c-45bd-9f7b-51e5eea9f69e/dns-node-resolver/0.log"
Apr 24 16:54:39.351776 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.351684 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb" event={"ID":"4cae0c3c-6f57-412f-a400-62e074b8a7cd","Type":"ContainerStarted","Data":"f5df70819eb4a2062d7429bc22f2d0164424013be7303e8146401d2946d6c3b6"}
Apr 24 16:54:39.351776 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.351719 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb" event={"ID":"4cae0c3c-6f57-412f-a400-62e074b8a7cd","Type":"ContainerStarted","Data":"750cef3902e712f211ee690f0d2dc97b27cdf9563c9dedc7136bfeac04e1825c"}
Apr 24 16:54:39.351776 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.351734 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:39.367829 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:39.367776 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb" podStartSLOduration=1.3677611170000001 podStartE2EDuration="1.367761117s" podCreationTimestamp="2026-04-24 16:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:54:39.366110871 +0000 UTC m=+931.823424742" watchObservedRunningTime="2026-04-24 16:54:39.367761117 +0000 UTC m=+931.825074988"
Apr 24 16:54:45.365414 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:45.365385 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-l4xqb"
Apr 24 16:54:53.085659 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:53.085626 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w297f_e5357c94-3bc4-4825-808b-68599eb79e96/node-ca/0.log"
Apr 24 16:54:53.945487 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:53.945460 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d8c574-57dx2_04503171-24fb-471c-9dcc-be6c1d1b3331/router/0.log"
Apr 24 16:54:54.399925 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:54.399898 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zgxq7_444f46c8-3c9a-4e72-8000-ca142ae511ef/serve-healthcheck-canary/0.log"
Apr 24 16:54:54.850490 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:54.850458 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jltdt_bc664fe7-8e82-414a-88b1-9faec08dd651/insights-operator/0.log"
Apr 24 16:54:54.852302 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:54.852282 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jltdt_bc664fe7-8e82-414a-88b1-9faec08dd651/insights-operator/1.log"
Apr 24 16:54:54.873391 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:54.873371 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9rqvf_31adc9d7-5a88-4a54-885a-6d3a7451a4d5/kube-rbac-proxy/0.log"
Apr 24 16:54:54.895313 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:54.895292 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9rqvf_31adc9d7-5a88-4a54-885a-6d3a7451a4d5/exporter/0.log"
Apr 24 16:54:54.917973 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:54:54.917951 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9rqvf_31adc9d7-5a88-4a54-885a-6d3a7451a4d5/extractor/0.log"
Apr 24 16:55:00.641495 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:55:00.641466 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-94ptf_d1584eff-fb10-45b7-979a-deabbfef9402/migrator/0.log"
Apr 24 16:55:00.663432 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:55:00.663404 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-94ptf_d1584eff-fb10-45b7-979a-deabbfef9402/graceful-termination/0.log"
Apr 24 16:55:01.000108 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:55:01.000078 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6kdn6_e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9/kube-storage-version-migrator-operator/1.log"
Apr 24 16:55:01.000813 ip-10-0-143-144 kubenswrapper[2562]: I0424 16:55:01.000795 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6kdn6_e0aba91b-576f-4ced-95dc-ed6bbd9e5cf9/kube-storage-version-migrator-operator/0.log"