Apr 24 21:28:46.012488 ip-10-0-142-162 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:28:46.462414 ip-10-0-142-162 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:28:46.462414 ip-10-0-142-162 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:28:46.462414 ip-10-0-142-162 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:28:46.462949 ip-10-0-142-162 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:28:46.462949 ip-10-0-142-162 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
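The deprecation warnings above say these flags should move into the file passed via --config (a KubeletConfiguration). A minimal sketch of what that migration could look like, assuming the v1beta1 KubeletConfiguration fields `containerRuntimeEndpoint`, `volumePluginDir`, `systemReserved`, and `evictionHard`; the output path and the concrete values here are illustrative, not taken from this node:

```shell
# Illustrative only: write a KubeletConfiguration fragment covering the
# deprecated flags from the log. Path and values are assumptions.
cat > /tmp/kubelet-config-example.yaml <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
EOF
grep -c 'kind: KubeletConfiguration' /tmp/kubelet-config-example.yaml
```

Note that --pod-infra-container-image has no config-file equivalent; per the log message, the image garbage collector now gets the sandbox image from the CRI runtime instead.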
Apr 24 21:28:46.464098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.464029    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:28:46.469997 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.469977    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:46.469997 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.469995    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:46.469997 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.469999    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:46.469997 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470002    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470006    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470009    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470012    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470016    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470020    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470023    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470025    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470028    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470031    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470033    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470036    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470039    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470042    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470044    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470047    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470050    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470053    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470055    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:46.470159 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470058    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470061    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470064    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470069    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470073    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470076    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470078    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470081    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470084    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470086    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470089    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470091    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470094    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470097    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470100    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470103    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470105    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470108    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470111    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470113    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:46.470641 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470116    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470119    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470122    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470124    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470127    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470129    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470133    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470135    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470138    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470141    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470143    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470146    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470149    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470151    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470154    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470157    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470159    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470162    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470166    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:28:46.471123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470169    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470172    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470175    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470177    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470180    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470183    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470185    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470188    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470190    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470193    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470196    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470198    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470201    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470203    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470206    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470209    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470211    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470215    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470217    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470221    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:46.471646 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470236    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470238    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470241    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470244    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470247    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470610    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470622    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470627    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470630    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470635    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470638    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470641    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470645    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470648    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470651    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470654    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470657    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470660    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470662    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:46.472133 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470665    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470668    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470671    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470674    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470677    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470681    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470684    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470686    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470689    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470691    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470695    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470697    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470700    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470702    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470705    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470707    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470710    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470713    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470715    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470718    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:46.472634 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470721    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470723    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470726    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470728    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470731    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470733    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470736    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470738    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470741    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470743    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470746    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470748    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470750    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470753    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470756    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470759    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470762    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470764    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470767    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:46.473132 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470770    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470772    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470775    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470777    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470780    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470783    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470785    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470787    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470790    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470792    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470795    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470798    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470801    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470803    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470806    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470808    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470811    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470814    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470816    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470818    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:46.473632 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470821    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470824    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470827    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470830    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470833    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470836    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470838    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470841    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470844    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470846    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470849    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470851    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.470853    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472928    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472939    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472946    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472950    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472955    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472959    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472963    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472967    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472970    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:28:46.474121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472973    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472978    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472981    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472984    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472987    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472990    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472993    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472996    2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.472998    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473001    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473006    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473009    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473013    2574 flags.go:64] FLAG: --config-dir=""
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473015    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473019    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473022    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473025    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473028    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473031    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473035    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473037    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473040    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473044    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473046    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473054    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:28:46.474674 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473057    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473059    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473062    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473066    2574 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473069    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473073    2574 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473076    2574 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473079    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473082    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473086    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473089    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473092    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473095    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473098    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473101    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473104    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473107    2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473110    2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473113    2574 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473116 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473119 2574 flags.go:64] FLAG: --feature-gates="" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473123 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473126 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473129 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473132 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473136 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:28:46.475285 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473139 2574 flags.go:64] FLAG: --help="false" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473141 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473145 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473148 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473151 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473154 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:28:46.473157 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473161 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473163 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473166 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473169 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473172 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473176 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473179 2574 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473182 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473185 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473188 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473191 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473194 2574 flags.go:64] FLAG: --lock-file="" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473197 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473199 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:28:46.475906 
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473202 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473208 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:28:46.475906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473211 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473214 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473217 2574 flags.go:64] FLAG: --logging-format="text" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473220 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473233 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473237 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473240 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473244 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473247 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473250 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473253 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473256 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473259 2574 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473263 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473266 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473269 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473272 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473279 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473282 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473287 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473291 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473294 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473299 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473302 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:28:46.476492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473305 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473307 2574 flags.go:64] FLAG: --port="10250" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:28:46.473310 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473313 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d984f374e2a9c862" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473316 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473319 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473322 2574 flags.go:64] FLAG: --register-node="true" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473325 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473328 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473332 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473334 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473337 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473340 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473344 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473347 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473349 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473352 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:28:46.477065 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:28:46.473355 2574 flags.go:64] FLAG: --runonce="false" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473358 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473361 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473364 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473367 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473370 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473373 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473376 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473379 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:28:46.477065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473381 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473384 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473387 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473390 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473393 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473396 2574 
flags.go:64] FLAG: --system-cgroups="" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473399 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473404 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473407 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473410 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473414 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473416 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473419 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473422 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473425 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473428 2574 flags.go:64] FLAG: --v="2" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473432 2574 flags.go:64] FLAG: --version="false" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473435 2574 flags.go:64] FLAG: --vmodule="" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473440 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.473443 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:28:46.477695 ip-10-0-142-162 
kubenswrapper[2574]: W0424 21:28:46.473546 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473551 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473554 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473557 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:28:46.477695 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473560 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473563 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473566 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473569 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473572 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473575 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473577 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473580 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473582 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode 
Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473585 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473587 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473589 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473592 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473595 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473598 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473601 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473603 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473606 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473608 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473611 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:28:46.478266 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473613 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473615 
2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473618 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473620 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473623 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473625 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473628 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473630 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473633 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473635 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473638 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473641 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473643 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473646 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:28:46.478884 
ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473649 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473651 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473654 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473657 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473660 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473663 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:28:46.478884 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473666 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473669 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473671 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473674 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473677 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473682 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473684 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473687 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473689 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473692 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473694 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473697 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473701 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473704 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473709 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473712 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473715 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 
21:28:46.473717 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473720 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:28:46.479397 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473723 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473725 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473728 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473730 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473733 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473736 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473738 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473741 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473743 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473746 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473748 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 
21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473751 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473753 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473756 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473758 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473761 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473763 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473766 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473768 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473771 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:46.479872 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473774 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473776 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.473779 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.474436 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.480293 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.480309 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480354 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480359 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480362 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480366 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:46.480364 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480369 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480372 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480376 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480379 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480382 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480385 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480388 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480390 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480393 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480395 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480398 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480401 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480403 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480406 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480408 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480411 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480414 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480417 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480419 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:46.480611 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480422 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480424 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480427 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480429 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480432 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480434 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480437 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480439 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480443 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480445 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480448 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480450 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480453 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480456 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480458 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480461 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480464 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480467 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480469 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480472 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:46.481072 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480474 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480478 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480482 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480484 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480487 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480490 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480492 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480495 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480497 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480500 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480502 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480505 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480507 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480510 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480512 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480514 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480517 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480520 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480522 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:46.481579 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480525 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480528 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
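The flood of `feature_gate.go:328] unrecognized feature gate: …` warnings above comes from cluster-level (OpenShift) gate names being handed to a component that only knows the Kubernetes kubelet's gate set: unknown names are warned about and skipped, while known names are parsed into the boolean map logged at `feature_gate.go:384`. A minimal sketch of that parse step, assuming an illustrative `KNOWN_GATES` set and input string (not the component's real gate registry or flags):

```python
# Hedged sketch: parsing a "Name=true,Name2=false" feature-gate spec into a map,
# warning on names the component does not recognize. KNOWN_GATES is illustrative.
KNOWN_GATES = {"ImageVolume", "NodeSwap", "KMSv1"}

def parse_feature_gates(spec):
    """Return ({gate: bool}, [unrecognized names]) for a comma-separated spec."""
    gates, unrecognized = {}, []
    for item in filter(None, (s.strip() for s in spec.split(","))):
        name, _, value = item.partition("=")
        if name not in KNOWN_GATES:
            # mirrors the log line: "unrecognized feature gate: <name>"
            unrecognized.append(name)
            continue
        gates[name] = value.lower() == "true"
    return gates, unrecognized

gates, unknown = parse_feature_gates("ImageVolume=true,KMSv1=true,GatewayAPI=true")
# gates  -> {"ImageVolume": True, "KMSv1": True}
# unknown -> ["GatewayAPI"]
```

The warnings are therefore benign by design: a gate meant for another operator simply does not apply to this process.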
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480532 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480535 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480537 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480540 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480542 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480545 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480548 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480551 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480553 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480556 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480558 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480561 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480563 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480566 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480568 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480571 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480573 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480576 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:46.482047 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480578 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480581 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480583 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480586 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.480591 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480681 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480685 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480689 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480691 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480694 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480697 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480700 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480703 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480706 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480708 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480712 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:28:46.482587 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480714 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480717 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480720 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480724 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480728 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480732 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480735 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480738 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480740 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480743 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480754 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480757 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480759 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480762 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480764 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480767 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480770 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480772 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480775 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:28:46.482994 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480778 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480780 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480783 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480785 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480788 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480790 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480793 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480796 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480798 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480801 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480803 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480806 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480808 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480811 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480814 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480817 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480820 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480823 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480825 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480828 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:28:46.483456 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480831 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480834 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480836 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480839 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480841 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480844 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480846 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480849 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480852 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480854 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480856 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480859 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480861 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480864 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480866 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480869 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480871 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480874 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480877 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480879 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:28:46.483934 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480882 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480884 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480887 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480889 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480891 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480894 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480897 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480899 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480902 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480905 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480907 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480911 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
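Alongside the "unrecognized" warnings, the parser emits stage-aware warnings at `feature_gate.go:351` (setting a GA gate explicitly) and `feature_gate.go:349` (setting a deprecated gate such as KMSv1): the value is accepted, but the gate is flagged for future removal. A minimal sketch of that behavior, with an illustrative stage table (not the real Kubernetes gate registry):

```python
# Hedged sketch: warn when a GA or deprecated gate is set explicitly, as the
# feature_gate.go:349/351 log lines above do. STAGES is illustrative only.
STAGES = {
    "ServiceAccountTokenNodeBinding": "GA",
    "KMSv1": "Deprecated",
    "NodeSwap": "Beta",
}

def set_gate(name, value):
    """Return the warning text a stage-aware gate setter would log, or None."""
    stage = STAGES.get(name)
    suffix = f"{name}={str(value).lower()}. It will be removed in a future release."
    if stage == "GA":
        return "Setting GA feature gate " + suffix
    if stage == "Deprecated":
        return "Setting deprecated feature gate " + suffix
    return None  # Alpha/Beta gates are set without a warning

print(set_gate("KMSv1", True))
# Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
```

This explains why the same KMSv1 warning recurs: each parsing pass over the configured gates re-emits it.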
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480914 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480917 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480920 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:46.480923 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:28:46.484412 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.480927 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:28:46.484792 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.481789 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:28:46.484792 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.483705 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:28:46.484792 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.484672 2574 server.go:1019] "Starting client certificate rotation"
Apr 24 21:28:46.484792 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.484767 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:28:46.484901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.484808 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:28:46.510331 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.510315 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:28:46.516291 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.516269 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:28:46.530622 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.530595 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:28:46.536128 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.536115 2574 log.go:25] "Validated CRI v1 image API"
Apr 24 21:28:46.537350 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.537328 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:28:46.543186 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.543167 2574 fs.go:135] Filesystem UUIDs: map[4341f843-412b-4a4e-a084-5e11b7df9c90:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a692ba9d-3006-4337-aa07-21f54ffb457f:/dev/nvme0n1p3]
Apr 24 21:28:46.543267 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.543185 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:28:46.545881 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.545866 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:28:46.548752 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.548651 2574 manager.go:217] Machine: {Timestamp:2026-04-24 21:28:46.54668163 +0000 UTC m=+0.417659120 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096914 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26260b77e03e1fd919b0c616ba63d0 SystemUUID:ec26260b-77e0-3e1f-d919-b0c616ba63d0 BootID:e00f3891-0245-4cd8-9b78-9e63009477f0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6e:5c:62:57:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6e:5c:62:57:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:84:51:cf:c4:f8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:28:46.548752 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.548746 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:28:46.548853 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.548808 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:28:46.549847 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.549825 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:28:46.549998 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.549849 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-162.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:28:46.550045 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.550007 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:28:46.550045 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.550016 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:28:46.550045 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.550027
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:28:46.550836 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.550825 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:28:46.551544 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.551535 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:28:46.551638 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.551629 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:28:46.554280 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.554271 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:28:46.554361 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.554288 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:28:46.554361 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.554300 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:28:46.554361 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.554308 2574 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:28:46.554361 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.554316 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:28:46.555621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.555610 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:28:46.555658 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.555628 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:28:46.558661 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.558645 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:28:46.560095 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:28:46.560082 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:28:46.562456 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562089 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562462 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562469 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562475 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562481 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562488 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562500 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562506 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562513 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:28:46.562522 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562521 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:28:46.562761 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562545 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 
21:28:46.562761 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.562556 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:28:46.563490 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.563470 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:28:46.563576 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.563498 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:28:46.566896 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.566878 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-162.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:28:46.566969 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.566949 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-162.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:28:46.568133 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.568115 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:28:46.568133 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.568119 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:28:46.568255 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.568170 2574 server.go:1295] "Started kubelet" Apr 24 21:28:46.568321 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.568281 2574 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 24 21:28:46.568395 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.568354 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:28:46.568446 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.568414 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:28:46.569012 ip-10-0-142-162 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:28:46.569555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.569444 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:28:46.570336 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.570323 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:28:46.574825 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.574805 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:28:46.574904 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.574823 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:28:46.575048 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.573932 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-162.ec2.internal.18a9683a1079139e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-162.ec2.internal,UID:ip-10-0-142-162.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-162.ec2.internal,},FirstTimestamp:2026-04-24 21:28:46.568133534 +0000 UTC m=+0.439111029,LastTimestamp:2026-04-24 21:28:46.568133534 +0000 UTC m=+0.439111029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-162.ec2.internal,}" Apr 24 21:28:46.575474 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.575455 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:28:46.575558 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.575541 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:28:46.575639 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.575564 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:28:46.575700 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.575681 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:28:46.575700 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.575691 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:28:46.575794 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.575779 2574 factory.go:55] Registering systemd factory Apr 24 21:28:46.575857 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.575845 2574 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:28:46.575908 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.575860 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found" Apr 24 21:28:46.577135 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.577113 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:28:46.577548 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.577529 2574 factory.go:153] Registering CRI-O factory Apr 24 21:28:46.577548 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.577550 2574 
factory.go:223] Registration of the crio container factory successfully Apr 24 21:28:46.577671 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.577613 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:28:46.577671 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.577635 2574 factory.go:103] Registering Raw factory Apr 24 21:28:46.577671 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.577648 2574 manager.go:1196] Started watching for new ooms in manager Apr 24 21:28:46.577883 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.577854 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-162.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:28:46.578179 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.578035 2574 manager.go:319] Starting recovery of all containers Apr 24 21:28:46.578735 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.578685 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:28:46.587506 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.587492 2574 manager.go:324] Recovery completed Apr 24 21:28:46.590964 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.590933 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wtc9f" Apr 24 21:28:46.592573 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.592560 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:46.594701 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.594686 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:46.594771 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.594714 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:46.594771 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.594725 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:46.595173 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.595158 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:28:46.595173 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.595171 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:28:46.595284 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.595186 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:28:46.597166 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.597155 2574 policy_none.go:49] "None policy: Start" Apr 24 21:28:46.597197 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.597169 2574 memory_manager.go:186] "Starting memorymanager" 
policy="None" Apr 24 21:28:46.597197 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.597179 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:28:46.599100 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.599078 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wtc9f" Apr 24 21:28:46.634749 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.634737 2574 manager.go:341] "Starting Device Plugin manager" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.634763 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.634773 2574 server.go:85] "Starting device plugin registration server" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.634967 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.634976 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.635052 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.635118 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.635127 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.635641 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 21:28:46.643807 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.635667 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-162.ec2.internal\" not found" Apr 24 21:28:46.709894 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.709873 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:28:46.710981 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.710963 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:28:46.711069 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.710988 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:28:46.711069 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.711004 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:28:46.711069 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.711013 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:28:46.711069 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.711048 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:28:46.714144 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.714100 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:46.735915 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.735901 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:46.736618 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.736602 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:46.736681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.736627 2574 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:46.736681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.736637 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:46.736681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.736657 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.743590 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.743570 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.743668 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.743592 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-162.ec2.internal\": node \"ip-10-0-142-162.ec2.internal\" not found" Apr 24 21:28:46.757004 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.756982 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found" Apr 24 21:28:46.811472 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.811451 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal"] Apr 24 21:28:46.811546 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.811532 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:46.812323 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.812310 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:46.812387 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.812338 2574 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:46.812387 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.812352 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:46.814482 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.814469 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:46.814643 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.814629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.814712 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.814696 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:46.815107 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.815089 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:46.815183 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.815114 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:46.815183 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.815097 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:46.815183 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.815150 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:46.815183 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.815161 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" 
event="NodeHasSufficientPID" Apr 24 21:28:46.815331 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.815129 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:46.817322 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.817309 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.817379 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.817332 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:46.817979 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.817956 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:46.818053 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.817990 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:46.818053 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.818006 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:46.843792 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.843771 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-162.ec2.internal\" not found" node="ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.847879 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.847862 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-162.ec2.internal\" not found" node="ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.857421 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.857400 2574 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found" Apr 24 21:28:46.877713 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.877698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2bbc7de608d73d2f4b55b275589c89ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal\" (UID: \"2bbc7de608d73d2f4b55b275589c89ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.877803 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.877728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bbc7de608d73d2f4b55b275589c89ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal\" (UID: \"2bbc7de608d73d2f4b55b275589c89ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.877803 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.877752 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a49fc3c8bc7b82f59a5e7858c439f11d-config\") pod \"kube-apiserver-proxy-ip-10-0-142-162.ec2.internal\" (UID: \"a49fc3c8bc7b82f59a5e7858c439f11d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.957819 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:46.957799 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found" Apr 24 21:28:46.978221 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.978179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a49fc3c8bc7b82f59a5e7858c439f11d-config\") pod 
\"kube-apiserver-proxy-ip-10-0-142-162.ec2.internal\" (UID: \"a49fc3c8bc7b82f59a5e7858c439f11d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.978221 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.978207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2bbc7de608d73d2f4b55b275589c89ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal\" (UID: \"2bbc7de608d73d2f4b55b275589c89ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.978330 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.978238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bbc7de608d73d2f4b55b275589c89ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal\" (UID: \"2bbc7de608d73d2f4b55b275589c89ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.978330 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.978282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2bbc7de608d73d2f4b55b275589c89ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal\" (UID: \"2bbc7de608d73d2f4b55b275589c89ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.978330 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:46.978288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a49fc3c8bc7b82f59a5e7858c439f11d-config\") pod \"kube-apiserver-proxy-ip-10-0-142-162.ec2.internal\" (UID: \"a49fc3c8bc7b82f59a5e7858c439f11d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal" Apr 24 21:28:46.978330 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:28:46.978328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bbc7de608d73d2f4b55b275589c89ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal\" (UID: \"2bbc7de608d73d2f4b55b275589c89ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal"
Apr 24 21:28:47.058486 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.058463 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.144926 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.144902 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal"
Apr 24 21:28:47.150293 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.150277 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal"
Apr 24 21:28:47.158867 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.158849 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.259372 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.259322 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.359793 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.359770 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.460276 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.460247 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.484467 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.484445 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:28:47.485063 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.484571 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:28:47.561052 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.561005 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.575009 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.574986 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:28:47.585990 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.585972 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:28:47.602438 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.602393 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:23:46 +0000 UTC" deadline="2027-11-11 01:51:03.04099789 +0000 UTC"
Apr 24 21:28:47.602438 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.602439 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13564h22m15.438562763s"
Apr 24 21:28:47.609114 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.609096 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5dwl7"
Apr 24 21:28:47.617260 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.617241 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5dwl7"
Apr 24 21:28:47.636380 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.636363 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:28:47.661099 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.661075 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.681123 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.681107 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:28:47.707044 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:47.707016 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49fc3c8bc7b82f59a5e7858c439f11d.slice/crio-bc3c05e1799d035554cc72b8f9d3a76901429c37d8816405d569556dee28ae0f WatchSource:0}: Error finding container bc3c05e1799d035554cc72b8f9d3a76901429c37d8816405d569556dee28ae0f: Status 404 returned error can't find the container with id bc3c05e1799d035554cc72b8f9d3a76901429c37d8816405d569556dee28ae0f
Apr 24 21:28:47.707507 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:47.707493 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bbc7de608d73d2f4b55b275589c89ee.slice/crio-96f6d006ceeb50a948fae805924ff012ac961467976b62bf1b7b1132aa8d5ee6 WatchSource:0}: Error finding container 96f6d006ceeb50a948fae805924ff012ac961467976b62bf1b7b1132aa8d5ee6: Status 404 returned error can't find the container with id 96f6d006ceeb50a948fae805924ff012ac961467976b62bf1b7b1132aa8d5ee6
Apr 24 21:28:47.712302 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.712285 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:28:47.714759 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.713774 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" event={"ID":"2bbc7de608d73d2f4b55b275589c89ee","Type":"ContainerStarted","Data":"96f6d006ceeb50a948fae805924ff012ac961467976b62bf1b7b1132aa8d5ee6"}
Apr 24 21:28:47.715956 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.715931 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal" event={"ID":"a49fc3c8bc7b82f59a5e7858c439f11d","Type":"ContainerStarted","Data":"bc3c05e1799d035554cc72b8f9d3a76901429c37d8816405d569556dee28ae0f"}
Apr 24 21:28:47.761551 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.761529 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.862025 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.861971 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.962505 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:47.962478 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-162.ec2.internal\" not found"
Apr 24 21:28:47.996934 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:47.996914 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:28:48.075887 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.075858 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal"
Apr 24 21:28:48.087188 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.087163 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:28:48.088174 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.088153 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal"
Apr 24 21:28:48.096268 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.096252 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:28:48.555276 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.555248 2574 apiserver.go:52] "Watching apiserver"
Apr 24 21:28:48.561826 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.561803 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:28:48.562209 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.562189 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qqsnz","openshift-multus/network-metrics-daemon-tdnnb","openshift-network-operator/iptables-alerter-d8cnv","kube-system/konnectivity-agent-q8sjc","openshift-cluster-node-tuning-operator/tuned-fkzj2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal","openshift-network-diagnostics/network-check-target-vtshd","openshift-ovn-kubernetes/ovnkube-node-256cw","kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts","openshift-dns/node-resolver-cwvxk","openshift-image-registry/node-ca-cdrzz","openshift-multus/multus-additional-cni-plugins-wn7sd"]
Apr 24 21:28:48.565303 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.565276 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:48.565400 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:48.565376 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:28:48.567441 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.567424 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:48.567508 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:48.567480 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:28:48.569538 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.569513 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-d8cnv"
Apr 24 21:28:48.571491 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.571448 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:28:48.571581 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.571500 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f4t5m\""
Apr 24 21:28:48.571581 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.571571 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:48.571904 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.571829 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:48.571904 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.571894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q8sjc"
Apr 24 21:28:48.573676 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.573660 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:28:48.573764 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.573708 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v6jc2\""
Apr 24 21:28:48.573909 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.573889 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:28:48.576520 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.576201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.576520 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.576301 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.578362 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.578189 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hf4gz\""
Apr 24 21:28:48.578499 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.578455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:28:48.578703 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.578555 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j6dh8\""
Apr 24 21:28:48.578703 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.578650 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:48.578908 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.578892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:28:48.578997 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.578976 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:48.578997 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.578991 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:28:48.579157 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.579068 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:28:48.580968 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.580930 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.584636 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.584558 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:28:48.584812 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.584793 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mk6gs\""
Apr 24 21:28:48.584906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.584839 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:28:48.584906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.584859 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:28:48.585027 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.584944 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:28:48.585027 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.584972 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts"
Apr 24 21:28:48.585140 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.585055 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:28:48.585140 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.585067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cwvxk"
Apr 24 21:28:48.585396 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.585375 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:28:48.587129 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587129 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-socket-dir-parent\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587294 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587117 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-k8s-cni-cncf-io\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587294 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/39fba077-f532-47c2-b634-29e01862bef6-host-slash\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv"
Apr 24 21:28:48.587294 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587187 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcnz\" (UniqueName: \"kubernetes.io/projected/39fba077-f532-47c2-b634-29e01862bef6-kube-api-access-6wcnz\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv"
Apr 24 21:28:48.587294 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-systemd-units\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587294 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-cni-netd\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587294 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-conf-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587298 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:28:48.587621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9xn\" (UniqueName: \"kubernetes.io/projected/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-kube-api-access-9p9xn\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:48.587621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-slash\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587539 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:28:48.587621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-run-netns\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-modprobe-d\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587625 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587625 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4nqjp\""
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysctl-conf\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-multus-certs\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587715 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/39fba077-f532-47c2-b634-29e01862bef6-iptables-alerter-script\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587748 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8lq\" (UniqueName: \"kubernetes.io/projected/5a521b1a-3dde-4f1e-aa52-3728d09e9921-kube-api-access-vg8lq\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-etc-kubernetes\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwfg\" (UniqueName: \"kubernetes.io/projected/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-kube-api-access-npwfg\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587817 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587821 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pgp7d\""
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-kubelet\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysconfig\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-netns\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-cni-bin\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.587978 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-cni-bin\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.587995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-cnibin\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-os-release\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-systemd\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-env-overrides\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588097 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-systemd\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-kubelet\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-etc-openvswitch\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-kubernetes\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-system-cni-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-cni-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-var-lib-openvswitch\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-openvswitch\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588396 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e1b294d-b645-40e3-b659-41031123c7f2-ovn-node-metrics-cert\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-ovnkube-script-lib\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588448 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nh9d\" (UniqueName: \"kubernetes.io/projected/3e1b294d-b645-40e3-b659-41031123c7f2-kube-api-access-9nh9d\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588471 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f8b3ba0e-889f-4f1c-9e20-33df1e811158-konnectivity-ca\") pod \"konnectivity-agent-q8sjc\" (UID: \"f8b3ba0e-889f-4f1c-9e20-33df1e811158\") " pod="kube-system/konnectivity-agent-q8sjc"
Apr 24 21:28:48.588780 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-host\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a521b1a-3dde-4f1e-aa52-3728d09e9921-tmp\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-ovn\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-log-socket\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-daemon-config\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f8b3ba0e-889f-4f1c-9e20-33df1e811158-agent-certs\") pod \"konnectivity-agent-q8sjc\" (UID: \"f8b3ba0e-889f-4f1c-9e20-33df1e811158\") " pod="kube-system/konnectivity-agent-q8sjc"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588653 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-run\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-lib-modules\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-tuned\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-hostroot\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz"
Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:48.589571
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-node-log\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-ovnkube-config\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588838 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysctl-d\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588860 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-sys\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-var-lib-kubelet\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-cni-binary-copy\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.589571 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.588981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-cni-multus\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.590374 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.589896 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.590374 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.589952 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.591830 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.591812 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:28:48.591938 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.591884 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zsbhm\"" Apr 24 21:28:48.592093 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.592069 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:28:48.592167 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.592115 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:28:48.592241 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.592184 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:28:48.592304 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.592240 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zlgt8\"" Apr 24 21:28:48.592348 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.592334 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:28:48.617830 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.617800 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:23:47 +0000 UTC" deadline="2027-10-23 00:26:09.175485055 +0000 UTC" Apr 24 21:28:48.617830 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.617829 
2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13106h57m20.557659354s" Apr 24 21:28:48.676514 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.676487 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:28:48.689724 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-node-log\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.689851 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-ovnkube-config\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.689851 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysctl-d\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.689851 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689783 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-sys\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.689851 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689817 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-node-log\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.689851 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-var-lib-kubelet\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-cni-binary-copy\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-var-lib-kubelet\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-sys\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689891 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-cni-multus\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysctl-d\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689928 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-socket-dir-parent\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689950 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-cni-multus\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689966 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-k8s-cni-cncf-io\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.689990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/39fba077-f532-47c2-b634-29e01862bef6-host-slash\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcnz\" (UniqueName: \"kubernetes.io/projected/39fba077-f532-47c2-b634-29e01862bef6-kube-api-access-6wcnz\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690045 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-socket-dir-parent\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690098 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:28:48.690069 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/39fba077-f532-47c2-b634-29e01862bef6-host-slash\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-systemd-units\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-cni-netd\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-systemd-units\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-conf-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690206 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-conf-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnclq\" (UniqueName: \"kubernetes.io/projected/2fc5cade-b0b3-414a-88b0-ae3c0348001f-kube-api-access-cnclq\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690259 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-cni-netd\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690264 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9xn\" (UniqueName: \"kubernetes.io/projected/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-kube-api-access-9p9xn\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-slash\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:28:48.690312 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-k8s-cni-cncf-io\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-run-netns\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690365 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-slash\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-modprobe-d\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.690722 
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-run-netns\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.690722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-cni-binary-copy\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-ovnkube-config\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysctl-conf\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 
24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-modprobe-d\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-multus-certs\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/39fba077-f532-47c2-b634-29e01862bef6-iptables-alerter-script\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da82016d-3774-4430-881a-6479d2a7aa8c-hosts-file\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8lq\" (UniqueName: \"kubernetes.io/projected/5a521b1a-3dde-4f1e-aa52-3728d09e9921-kube-api-access-vg8lq\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690594 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-multus-certs\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysctl-conf\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znm9l\" (UniqueName: \"kubernetes.io/projected/bf6a1a97-e9a0-4091-b077-931e1415d0c5-kube-api-access-znm9l\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpfp6\" (UniqueName: \"kubernetes.io/projected/244284fa-4acf-45db-bf3a-c7bcd19a6b80-kube-api-access-gpfp6\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/da82016d-3774-4430-881a-6479d2a7aa8c-tmp-dir\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-etc-kubernetes\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-etc-kubernetes\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npwfg\" (UniqueName: \"kubernetes.io/projected/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-kube-api-access-npwfg\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-kubelet\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.691555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690952 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysconfig\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-netns\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.690999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-cni-bin\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691028 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fc5cade-b0b3-414a-88b0-ae3c0348001f-host\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-sysconfig\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-cni-bin\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691045 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-run-netns\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-etc-selinux\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691084 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-kubelet\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-cni-bin\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-host-cni-bin\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691139 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/39fba077-f532-47c2-b634-29e01862bef6-iptables-alerter-script\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-cnibin\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-cnibin\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691268 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-os-release\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-system-cni-dir\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-os-release\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.692328 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691384 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:48.691401 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-sys-fs\") 
pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-systemd\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:48.691458 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:49.191435332 +0000 UTC m=+3.062412809 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-systemd\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-env-overrides\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691539 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-systemd\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-kubelet\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-systemd\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-host-var-lib-kubelet\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-registration-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: 
\"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-etc-openvswitch\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-kubernetes\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-system-cni-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-cni-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-var-lib-openvswitch\") pod \"ovnkube-node-256cw\" 
(UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-etc-openvswitch\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-openvswitch\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-kubernetes\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-system-cni-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691823 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-var-lib-openvswitch\") pod \"ovnkube-node-256cw\" (UID: 
\"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-cni-dir\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e1b294d-b645-40e3-b659-41031123c7f2-ovn-node-metrics-cert\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-openvswitch\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-ovnkube-script-lib\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2fc5cade-b0b3-414a-88b0-ae3c0348001f-serviceca\") pod \"node-ca-cdrzz\" 
(UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-env-overrides\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.691937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cnibin\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cni-binary-copy\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9nh9d\" (UniqueName: \"kubernetes.io/projected/3e1b294d-b645-40e3-b659-41031123c7f2-kube-api-access-9nh9d\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f8b3ba0e-889f-4f1c-9e20-33df1e811158-konnectivity-ca\") pod \"konnectivity-agent-q8sjc\" (UID: \"f8b3ba0e-889f-4f1c-9e20-33df1e811158\") " pod="kube-system/konnectivity-agent-q8sjc" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-host\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.693980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a521b1a-3dde-4f1e-aa52-3728d09e9921-tmp\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692200 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-os-release\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692201 2574 swap_util.go:74] "error creating dir to test if 
tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvqjk\" (UniqueName: \"kubernetes.io/projected/da82016d-3774-4430-881a-6479d2a7aa8c-kube-api-access-bvqjk\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-ovn\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-log-socket\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-daemon-config\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-device-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f8b3ba0e-889f-4f1c-9e20-33df1e811158-agent-certs\") pod \"konnectivity-agent-q8sjc\" (UID: \"f8b3ba0e-889f-4f1c-9e20-33df1e811158\") " pod="kube-system/konnectivity-agent-q8sjc" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-run\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e1b294d-b645-40e3-b659-41031123c7f2-ovnkube-script-lib\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692449 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-socket-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.692636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-host\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-run-ovn\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-multus-daemon-config\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.694800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693515 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-lib-modules\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-tuned\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-run\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e1b294d-b645-40e3-b659-41031123c7f2-log-socket\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f8b3ba0e-889f-4f1c-9e20-33df1e811158-konnectivity-ca\") pod \"konnectivity-agent-q8sjc\" (UID: \"f8b3ba0e-889f-4f1c-9e20-33df1e811158\") " pod="kube-system/konnectivity-agent-q8sjc" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693605 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-hostroot\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-hostroot\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.693675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a521b1a-3dde-4f1e-aa52-3728d09e9921-lib-modules\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.695621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.695568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a521b1a-3dde-4f1e-aa52-3728d09e9921-tmp\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.695960 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.695686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e1b294d-b645-40e3-b659-41031123c7f2-ovn-node-metrics-cert\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.696119 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.696101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a521b1a-3dde-4f1e-aa52-3728d09e9921-etc-tuned\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.696185 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.696135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f8b3ba0e-889f-4f1c-9e20-33df1e811158-agent-certs\") pod \"konnectivity-agent-q8sjc\" (UID: \"f8b3ba0e-889f-4f1c-9e20-33df1e811158\") " pod="kube-system/konnectivity-agent-q8sjc" Apr 24 21:28:48.703044 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:48.703022 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:48.703044 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:48.703040 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:48.703200 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:48.703051 2574 projected.go:194] Error preparing data for projected volume kube-api-access-2j2qz for pod openshift-network-diagnostics/network-check-target-vtshd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:48.703200 ip-10-0-142-162 kubenswrapper[2574]: E0424 
21:28:48.703097 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz podName:ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:49.203086579 +0000 UTC m=+3.074064059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2j2qz" (UniqueName: "kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz") pod "network-check-target-vtshd" (UID: "ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:48.704508 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.704486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwfg\" (UniqueName: \"kubernetes.io/projected/4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0-kube-api-access-npwfg\") pod \"multus-qqsnz\" (UID: \"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0\") " pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.704792 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.704775 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcnz\" (UniqueName: \"kubernetes.io/projected/39fba077-f532-47c2-b634-29e01862bef6-kube-api-access-6wcnz\") pod \"iptables-alerter-d8cnv\" (UID: \"39fba077-f532-47c2-b634-29e01862bef6\") " pod="openshift-network-operator/iptables-alerter-d8cnv" Apr 24 21:28:48.704878 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.704824 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9xn\" (UniqueName: \"kubernetes.io/projected/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-kube-api-access-9p9xn\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:28:48.704878 
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.704827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8lq\" (UniqueName: \"kubernetes.io/projected/5a521b1a-3dde-4f1e-aa52-3728d09e9921-kube-api-access-vg8lq\") pod \"tuned-fkzj2\" (UID: \"5a521b1a-3dde-4f1e-aa52-3728d09e9921\") " pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.705617 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.705578 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nh9d\" (UniqueName: \"kubernetes.io/projected/3e1b294d-b645-40e3-b659-41031123c7f2-kube-api-access-9nh9d\") pod \"ovnkube-node-256cw\" (UID: \"3e1b294d-b645-40e3-b659-41031123c7f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.794289 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2fc5cade-b0b3-414a-88b0-ae3c0348001f-serviceca\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.794418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cnibin\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cni-binary-copy\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " 
pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cnibin\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794610 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-os-release\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794610 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794610 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-os-release\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794610 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794610 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvqjk\" (UniqueName: \"kubernetes.io/projected/da82016d-3774-4430-881a-6479d2a7aa8c-kube-api-access-bvqjk\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.794610 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794602 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-device-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794651 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-socket-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnclq\" (UniqueName: \"kubernetes.io/projected/2fc5cade-b0b3-414a-88b0-ae3c0348001f-kube-api-access-cnclq\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-device-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794734 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da82016d-3774-4430-881a-6479d2a7aa8c-hosts-file\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znm9l\" (UniqueName: \"kubernetes.io/projected/bf6a1a97-e9a0-4091-b077-931e1415d0c5-kube-api-access-znm9l\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 
21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2fc5cade-b0b3-414a-88b0-ae3c0348001f-serviceca\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpfp6\" (UniqueName: \"kubernetes.io/projected/244284fa-4acf-45db-bf3a-c7bcd19a6b80-kube-api-access-gpfp6\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794813 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/da82016d-3774-4430-881a-6479d2a7aa8c-tmp-dir\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-socket-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fc5cade-b0b3-414a-88b0-ae3c0348001f-host\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 
21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cni-binary-copy\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.794901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794868 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-etc-selinux\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-etc-selinux\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-system-cni-dir\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: 
\"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2fc5cade-b0b3-414a-88b0-ae3c0348001f-host\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-sys-fs\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.794986 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da82016d-3774-4430-881a-6479d2a7aa8c-hosts-file\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.795009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-registration-dir\") pod 
\"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.795019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf6a1a97-e9a0-4091-b077-931e1415d0c5-system-cni-dir\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.795065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.795077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-registration-dir\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.795088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/da82016d-3774-4430-881a-6479d2a7aa8c-tmp-dir\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.795071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/244284fa-4acf-45db-bf3a-c7bcd19a6b80-sys-fs\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.795594 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.795410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bf6a1a97-e9a0-4091-b077-931e1415d0c5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.804170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.804142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvqjk\" (UniqueName: \"kubernetes.io/projected/da82016d-3774-4430-881a-6479d2a7aa8c-kube-api-access-bvqjk\") pod \"node-resolver-cwvxk\" (UID: \"da82016d-3774-4430-881a-6479d2a7aa8c\") " pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.804365 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.804348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnclq\" (UniqueName: \"kubernetes.io/projected/2fc5cade-b0b3-414a-88b0-ae3c0348001f-kube-api-access-cnclq\") pod \"node-ca-cdrzz\" (UID: \"2fc5cade-b0b3-414a-88b0-ae3c0348001f\") " pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.804365 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.804361 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znm9l\" (UniqueName: \"kubernetes.io/projected/bf6a1a97-e9a0-4091-b077-931e1415d0c5-kube-api-access-znm9l\") pod \"multus-additional-cni-plugins-wn7sd\" (UID: \"bf6a1a97-e9a0-4091-b077-931e1415d0c5\") " pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:48.804460 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.804376 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpfp6\" (UniqueName: \"kubernetes.io/projected/244284fa-4acf-45db-bf3a-c7bcd19a6b80-kube-api-access-gpfp6\") pod \"aws-ebs-csi-driver-node-j2fts\" (UID: \"244284fa-4acf-45db-bf3a-c7bcd19a6b80\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.885192 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.885132 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-d8cnv" Apr 24 21:28:48.890727 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.890707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q8sjc" Apr 24 21:28:48.899452 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.899435 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" Apr 24 21:28:48.903983 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.903962 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qqsnz" Apr 24 21:28:48.910583 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.910566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:28:48.917171 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.917155 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" Apr 24 21:28:48.923700 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.923679 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cwvxk" Apr 24 21:28:48.931207 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.931189 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cdrzz" Apr 24 21:28:48.936710 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:48.936689 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" Apr 24 21:28:49.082549 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.082524 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:49.197980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.197922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:28:49.198101 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:49.198057 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:49.198147 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:49.198117 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:50.198099085 +0000 UTC m=+4.069076563 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:49.298435 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.298405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:28:49.298585 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:49.298560 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:49.298585 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:49.298575 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:49.298585 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:49.298584 2574 projected.go:194] Error preparing data for projected volume kube-api-access-2j2qz for pod openshift-network-diagnostics/network-check-target-vtshd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:49.298748 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:49.298639 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz podName:ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:50.298622133 +0000 UTC m=+4.169599621 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2j2qz" (UniqueName: "kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz") pod "network-check-target-vtshd" (UID: "ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:49.354058 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.353906 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39fba077_f532_47c2_b634_29e01862bef6.slice/crio-377738b0e9540495f43b1ca8962d8ff0366b6ef19ef1d5be928fbd0858c3bebc WatchSource:0}: Error finding container 377738b0e9540495f43b1ca8962d8ff0366b6ef19ef1d5be928fbd0858c3bebc: Status 404 returned error can't find the container with id 377738b0e9540495f43b1ca8962d8ff0366b6ef19ef1d5be928fbd0858c3bebc Apr 24 21:28:49.355194 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.355172 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd2bc89_9e9b_4a45_b5fb_585ad0a71cd0.slice/crio-e047f3eb0099157fc37010938135a6209691ee2c9fa3db9af91f564a3d3730c6 WatchSource:0}: Error finding container e047f3eb0099157fc37010938135a6209691ee2c9fa3db9af91f564a3d3730c6: Status 404 returned error can't find the container with id e047f3eb0099157fc37010938135a6209691ee2c9fa3db9af91f564a3d3730c6 Apr 24 21:28:49.359243 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.359201 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda82016d_3774_4430_881a_6479d2a7aa8c.slice/crio-a51706432ce88cd1a812768bf31ddea30fabe433c2ad8ffc2c2c84c1a8871e81 WatchSource:0}: Error finding container 
a51706432ce88cd1a812768bf31ddea30fabe433c2ad8ffc2c2c84c1a8871e81: Status 404 returned error can't find the container with id a51706432ce88cd1a812768bf31ddea30fabe433c2ad8ffc2c2c84c1a8871e81 Apr 24 21:28:49.359929 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.359905 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e1b294d_b645_40e3_b659_41031123c7f2.slice/crio-c53a1e4114dff5322cf4ea998b3207a57d79c4ded05db4d9d6c2cfe46b309439 WatchSource:0}: Error finding container c53a1e4114dff5322cf4ea998b3207a57d79c4ded05db4d9d6c2cfe46b309439: Status 404 returned error can't find the container with id c53a1e4114dff5322cf4ea998b3207a57d79c4ded05db4d9d6c2cfe46b309439 Apr 24 21:28:49.360590 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.360546 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b3ba0e_889f_4f1c_9e20_33df1e811158.slice/crio-3fe0dedab5dcb956806c5217127317994d44ee7986489fee455bbf25089df93e WatchSource:0}: Error finding container 3fe0dedab5dcb956806c5217127317994d44ee7986489fee455bbf25089df93e: Status 404 returned error can't find the container with id 3fe0dedab5dcb956806c5217127317994d44ee7986489fee455bbf25089df93e Apr 24 21:28:49.361524 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.361502 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf6a1a97_e9a0_4091_b077_931e1415d0c5.slice/crio-71b28cf9b53c6cfed8fa4936a2131b3e5a499a725058eeee394a3ed054486c81 WatchSource:0}: Error finding container 71b28cf9b53c6cfed8fa4936a2131b3e5a499a725058eeee394a3ed054486c81: Status 404 returned error can't find the container with id 71b28cf9b53c6cfed8fa4936a2131b3e5a499a725058eeee394a3ed054486c81 Apr 24 21:28:49.362178 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.362158 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fc5cade_b0b3_414a_88b0_ae3c0348001f.slice/crio-c40265e1f4fc1ebc4f5b2df975424e53d6913d9e92cc2de3f81b648dc6030138 WatchSource:0}: Error finding container c40265e1f4fc1ebc4f5b2df975424e53d6913d9e92cc2de3f81b648dc6030138: Status 404 returned error can't find the container with id c40265e1f4fc1ebc4f5b2df975424e53d6913d9e92cc2de3f81b648dc6030138 Apr 24 21:28:49.383216 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.383171 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244284fa_4acf_45db_bf3a_c7bcd19a6b80.slice/crio-40ce8580ff75625c1eaa78f2aec3f2c220c4b6be2dc6ddfd2db4c2a1324a3d83 WatchSource:0}: Error finding container 40ce8580ff75625c1eaa78f2aec3f2c220c4b6be2dc6ddfd2db4c2a1324a3d83: Status 404 returned error can't find the container with id 40ce8580ff75625c1eaa78f2aec3f2c220c4b6be2dc6ddfd2db4c2a1324a3d83 Apr 24 21:28:49.383969 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:28:49.383921 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a521b1a_3dde_4f1e_aa52_3728d09e9921.slice/crio-33da8e18c52649c68b4b6c9df9fc1a7992ef77df656f22591da71cbae64527f7 WatchSource:0}: Error finding container 33da8e18c52649c68b4b6c9df9fc1a7992ef77df656f22591da71cbae64527f7: Status 404 returned error can't find the container with id 33da8e18c52649c68b4b6c9df9fc1a7992ef77df656f22591da71cbae64527f7 Apr 24 21:28:49.618530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.618496 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:23:47 +0000 UTC" deadline="2027-11-07 19:14:32.097660686 +0000 UTC" Apr 24 21:28:49.618530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.618525 2574 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13485h45m42.479138218s"
Apr 24 21:28:49.723867 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.723754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" event={"ID":"5a521b1a-3dde-4f1e-aa52-3728d09e9921","Type":"ContainerStarted","Data":"33da8e18c52649c68b4b6c9df9fc1a7992ef77df656f22591da71cbae64527f7"}
Apr 24 21:28:49.729543 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.729499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerStarted","Data":"71b28cf9b53c6cfed8fa4936a2131b3e5a499a725058eeee394a3ed054486c81"}
Apr 24 21:28:49.733205 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.733161 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"c53a1e4114dff5322cf4ea998b3207a57d79c4ded05db4d9d6c2cfe46b309439"}
Apr 24 21:28:49.735440 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.735396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qqsnz" event={"ID":"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0","Type":"ContainerStarted","Data":"e047f3eb0099157fc37010938135a6209691ee2c9fa3db9af91f564a3d3730c6"}
Apr 24 21:28:49.741252 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.741197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" event={"ID":"244284fa-4acf-45db-bf3a-c7bcd19a6b80","Type":"ContainerStarted","Data":"40ce8580ff75625c1eaa78f2aec3f2c220c4b6be2dc6ddfd2db4c2a1324a3d83"}
Apr 24 21:28:49.745081 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.745056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cdrzz" event={"ID":"2fc5cade-b0b3-414a-88b0-ae3c0348001f","Type":"ContainerStarted","Data":"c40265e1f4fc1ebc4f5b2df975424e53d6913d9e92cc2de3f81b648dc6030138"}
Apr 24 21:28:49.755422 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.755399 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q8sjc" event={"ID":"f8b3ba0e-889f-4f1c-9e20-33df1e811158","Type":"ContainerStarted","Data":"3fe0dedab5dcb956806c5217127317994d44ee7986489fee455bbf25089df93e"}
Apr 24 21:28:49.758156 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.758026 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cwvxk" event={"ID":"da82016d-3774-4430-881a-6479d2a7aa8c","Type":"ContainerStarted","Data":"a51706432ce88cd1a812768bf31ddea30fabe433c2ad8ffc2c2c84c1a8871e81"}
Apr 24 21:28:49.763211 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.761450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d8cnv" event={"ID":"39fba077-f532-47c2-b634-29e01862bef6","Type":"ContainerStarted","Data":"377738b0e9540495f43b1ca8962d8ff0366b6ef19ef1d5be928fbd0858c3bebc"}
Apr 24 21:28:49.766320 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:49.766177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal" event={"ID":"a49fc3c8bc7b82f59a5e7858c439f11d","Type":"ContainerStarted","Data":"caaf44f360a1ee8197698976e25936b56ba0d3677117bd3bc691b15a08a731a5"}
Apr 24 21:28:50.205530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:50.204542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:50.205530 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.204720 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:50.205530 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.204780 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:52.204760743 +0000 UTC m=+6.075738225 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:50.305276 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:50.305238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:50.305449 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.305433 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:28:50.305512 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.305451 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:28:50.305512 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.305464 2574 projected.go:194] Error preparing data for projected volume kube-api-access-2j2qz for pod openshift-network-diagnostics/network-check-target-vtshd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:50.305617 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.305527 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz podName:ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:52.305508333 +0000 UTC m=+6.176485815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2j2qz" (UniqueName: "kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz") pod "network-check-target-vtshd" (UID: "ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:50.714508 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:50.714307 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:50.714508 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:50.714307 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:50.714508 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.714428 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:28:50.714508 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:50.714506 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:28:50.785460 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:50.785380 2574 generic.go:358] "Generic (PLEG): container finished" podID="2bbc7de608d73d2f4b55b275589c89ee" containerID="514eefdbd4842d04295d0a7d762c3c4fcb6ba059b65e52377b3a7a402c071b77" exitCode=0
Apr 24 21:28:50.786414 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:50.786356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" event={"ID":"2bbc7de608d73d2f4b55b275589c89ee","Type":"ContainerDied","Data":"514eefdbd4842d04295d0a7d762c3c4fcb6ba059b65e52377b3a7a402c071b77"}
Apr 24 21:28:50.802980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:50.802494 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-162.ec2.internal" podStartSLOduration=2.802477744 podStartE2EDuration="2.802477744s" podCreationTimestamp="2026-04-24 21:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:49.779933408 +0000 UTC m=+3.650910908" watchObservedRunningTime="2026-04-24 21:28:50.802477744 +0000 UTC m=+4.673455244"
Apr 24 21:28:51.793753 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:51.793063 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" event={"ID":"2bbc7de608d73d2f4b55b275589c89ee","Type":"ContainerStarted","Data":"713e8016cd59fda51cd7602796f5e54acfab7e840ce0c46c36555787d1860d37"}
Apr 24 21:28:52.219425 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:52.218671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:52.219425 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.218854 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:52.219425 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.218919 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:56.218898494 +0000 UTC m=+10.089875972 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:52.319587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:52.319548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:52.319762 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.319730 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:28:52.319762 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.319753 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:28:52.319762 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.319769 2574 projected.go:194] Error preparing data for projected volume kube-api-access-2j2qz for pod openshift-network-diagnostics/network-check-target-vtshd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:52.319982 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.319821 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz podName:ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:56.319808425 +0000 UTC m=+10.190785902 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2j2qz" (UniqueName: "kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz") pod "network-check-target-vtshd" (UID: "ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:52.712031 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:52.711951 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:52.712031 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:52.711987 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:52.712259 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.712087 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:28:52.712259 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:52.712236 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:28:54.711343 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:54.711311 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:54.711793 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:54.711455 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:28:54.711793 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:54.711324 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:54.711793 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:54.711763 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:28:56.252888 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:56.252766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:56.253341 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.252944 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:56.253341 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.253030 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.253010366 +0000 UTC m=+18.123987858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:56.354214 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:56.353640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:56.354214 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.353801 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:28:56.354214 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.353818 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:28:56.354214 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.353830 2574 projected.go:194] Error preparing data for projected volume kube-api-access-2j2qz for pod openshift-network-diagnostics/network-check-target-vtshd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:56.354214 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.353882 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz podName:ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.35386526 +0000 UTC m=+18.224842753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2j2qz" (UniqueName: "kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz") pod "network-check-target-vtshd" (UID: "ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:56.713478 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:56.712985 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:56.713478 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.713096 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:28:56.713478 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:56.713282 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:56.713478 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:56.713402 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:28:58.705434 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.705330 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-162.ec2.internal" podStartSLOduration=10.70531361 podStartE2EDuration="10.70531361s" podCreationTimestamp="2026-04-24 21:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:51.810407611 +0000 UTC m=+5.681385124" watchObservedRunningTime="2026-04-24 21:28:58.70531361 +0000 UTC m=+12.576291108"
Apr 24 21:28:58.705858 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.705834 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4xgsz"]
Apr 24 21:28:58.708542 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.708524 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.708646 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:58.708597 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb"
Apr 24 21:28:58.712087 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.712058 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:28:58.712203 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:58.712161 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:28:58.712203 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.712191 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:28:58.712316 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:58.712303 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:28:58.769857 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.769828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.770011 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.769874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1b28df9-d260-40f6-ba3b-63772a458eeb-kubelet-config\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.770011 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.769970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1b28df9-d260-40f6-ba3b-63772a458eeb-dbus\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.870602 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.870561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.870602 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.870608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1b28df9-d260-40f6-ba3b-63772a458eeb-kubelet-config\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.870831 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.870675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1b28df9-d260-40f6-ba3b-63772a458eeb-dbus\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.870831 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:58.870713 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:28:58.870831 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:58.870772 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret podName:f1b28df9-d260-40f6-ba3b-63772a458eeb nodeName:}" failed. No retries permitted until 2026-04-24 21:28:59.370757857 +0000 UTC m=+13.241735335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret") pod "global-pull-secret-syncer-4xgsz" (UID: "f1b28df9-d260-40f6-ba3b-63772a458eeb") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:28:58.870831 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.870776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1b28df9-d260-40f6-ba3b-63772a458eeb-kubelet-config\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:58.870831 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:58.870828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1b28df9-d260-40f6-ba3b-63772a458eeb-dbus\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:59.375435 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:28:59.375397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:28:59.375599 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:59.375528 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:28:59.375599 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:28:59.375580 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret podName:f1b28df9-d260-40f6-ba3b-63772a458eeb nodeName:}" failed. No retries permitted until 2026-04-24 21:29:00.375567491 +0000 UTC m=+14.246544974 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret") pod "global-pull-secret-syncer-4xgsz" (UID: "f1b28df9-d260-40f6-ba3b-63772a458eeb") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:29:00.383572 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:00.383536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:00.383943 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:00.383699 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:29:00.383943 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:00.383765 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret podName:f1b28df9-d260-40f6-ba3b-63772a458eeb nodeName:}" failed. No retries permitted until 2026-04-24 21:29:02.383748464 +0000 UTC m=+16.254725946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret") pod "global-pull-secret-syncer-4xgsz" (UID: "f1b28df9-d260-40f6-ba3b-63772a458eeb") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:29:00.711930 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:00.711836 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:00.711930 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:00.711839 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:00.711930 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:00.711897 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:00.712141 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:00.712002 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:29:00.712141 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:00.712084 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb"
Apr 24 21:29:00.712320 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:00.712138 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:29:02.400749 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:02.400712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:02.401252 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:02.400829 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:29:02.401252 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:02.400900 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret podName:f1b28df9-d260-40f6-ba3b-63772a458eeb nodeName:}" failed. No retries permitted until 2026-04-24 21:29:06.400878346 +0000 UTC m=+20.271855827 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret") pod "global-pull-secret-syncer-4xgsz" (UID: "f1b28df9-d260-40f6-ba3b-63772a458eeb") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:29:02.711981 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:02.711896 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:02.711981 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:02.711923 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:02.711981 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:02.711907 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:02.712206 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:02.712011 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb"
Apr 24 21:29:02.712206 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:02.712067 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:29:02.712206 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:02.712127 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:29:04.315482 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:04.315442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:04.315907 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.315625 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:04.315907 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.315698 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:20.315678143 +0000 UTC m=+34.186655635 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:29:04.415802 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:04.415775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:04.415991 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.415970 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:29:04.416082 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.415998 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:29:04.416082 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.416010 2574 projected.go:194] Error preparing data for projected volume kube-api-access-2j2qz for pod openshift-network-diagnostics/network-check-target-vtshd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:29:04.416082 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.416067 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz podName:ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374 nodeName:}" failed.
No retries permitted until 2026-04-24 21:29:20.416047385 +0000 UTC m=+34.287024877 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2j2qz" (UniqueName: "kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz") pod "network-check-target-vtshd" (UID: "ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:04.711668 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:04.711596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:04.711830 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:04.711597 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:29:04.711830 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.711722 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374" Apr 24 21:29:04.711830 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.711813 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0" Apr 24 21:29:04.711980 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:04.711597 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:04.711980 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:04.711910 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb" Apr 24 21:29:06.435353 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.433072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:06.435353 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:06.433379 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:29:06.435353 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:06.433434 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret podName:f1b28df9-d260-40f6-ba3b-63772a458eeb nodeName:}" failed. No retries permitted until 2026-04-24 21:29:14.433415805 +0000 UTC m=+28.304393282 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret") pod "global-pull-secret-syncer-4xgsz" (UID: "f1b28df9-d260-40f6-ba3b-63772a458eeb") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:29:06.712256 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.712217 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:06.712352 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:06.712306 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb" Apr 24 21:29:06.712352 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.712325 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:29:06.712352 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.712339 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:06.712516 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:06.712417 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374" Apr 24 21:29:06.712516 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:06.712492 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0" Apr 24 21:29:06.818970 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.818955 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:29:06.819278 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.819205 2574 generic.go:358] "Generic (PLEG): container finished" podID="3e1b294d-b645-40e3-b659-41031123c7f2" containerID="02f18dfc3dfd47fe5dba0053ffae9c034b1cec31021a5899fa8317bccf560f2c" exitCode=1 Apr 24 21:29:06.819278 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.819249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerDied","Data":"02f18dfc3dfd47fe5dba0053ffae9c034b1cec31021a5899fa8317bccf560f2c"} Apr 24 21:29:06.819393 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.819279 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"f493cb893bf975f2409a042aec16a2b23bec809b31374f0f51e3e354218a7f86"} Apr 24 21:29:06.820338 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.820321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qqsnz" 
event={"ID":"4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0","Type":"ContainerStarted","Data":"e0b6e71f798431b312378b958fb4ae2c57c90c5ff4c189979291701798f45a4b"} Apr 24 21:29:06.821387 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.821367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" event={"ID":"244284fa-4acf-45db-bf3a-c7bcd19a6b80","Type":"ContainerStarted","Data":"3b23408c30ccb0e2c7a37112dd3fdc4c6aa13c36b5d76f108b1ce60520df89a7"} Apr 24 21:29:06.822492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.822471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cdrzz" event={"ID":"2fc5cade-b0b3-414a-88b0-ae3c0348001f","Type":"ContainerStarted","Data":"7e52b9f478f32041e12ce80acbf70aa17188bec0bad1234b2eaedcc1fbaa0b19"} Apr 24 21:29:06.823734 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.823714 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q8sjc" event={"ID":"f8b3ba0e-889f-4f1c-9e20-33df1e811158","Type":"ContainerStarted","Data":"a65065ae14513bc657a994f470608284d0aee33506d7f488ad3a80895b1f01ae"} Apr 24 21:29:06.824857 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.824839 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cwvxk" event={"ID":"da82016d-3774-4430-881a-6479d2a7aa8c","Type":"ContainerStarted","Data":"ae5b5642bbb074a3eb044e45e555c8cd20f06123d5e1928dc1e29c5f6a8344a2"} Apr 24 21:29:06.827876 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.827853 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" event={"ID":"5a521b1a-3dde-4f1e-aa52-3728d09e9921","Type":"ContainerStarted","Data":"db6ba01a9d5f152c07968f9aff012217cdaa6b01d3347f800eb57f83d53d08a7"} Apr 24 21:29:06.829057 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.829038 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerStarted","Data":"1b00b0bdc51533c76335f68977ec7cb597b86a0c135d7ccd01fb85da3d6bfd3e"} Apr 24 21:29:06.842405 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.842366 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qqsnz" podStartSLOduration=3.8459205450000002 podStartE2EDuration="20.842356886s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.358184298 +0000 UTC m=+3.229161775" lastFinishedPulling="2026-04-24 21:29:06.354620628 +0000 UTC m=+20.225598116" observedRunningTime="2026-04-24 21:29:06.842290682 +0000 UTC m=+20.713268181" watchObservedRunningTime="2026-04-24 21:29:06.842356886 +0000 UTC m=+20.713334386" Apr 24 21:29:06.862577 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.862544 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cwvxk" podStartSLOduration=3.890284669 podStartE2EDuration="20.862534524s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.360999612 +0000 UTC m=+3.231977094" lastFinishedPulling="2026-04-24 21:29:06.333249457 +0000 UTC m=+20.204226949" observedRunningTime="2026-04-24 21:29:06.862307801 +0000 UTC m=+20.733285302" watchObservedRunningTime="2026-04-24 21:29:06.862534524 +0000 UTC m=+20.733512022" Apr 24 21:29:06.931279 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.931217 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q8sjc" podStartSLOduration=4.006525466 podStartE2EDuration="20.931203947s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.381998744 +0000 UTC m=+3.252976224" lastFinishedPulling="2026-04-24 21:29:06.306677213 +0000 UTC m=+20.177654705" observedRunningTime="2026-04-24 
21:29:06.912332984 +0000 UTC m=+20.783310482" watchObservedRunningTime="2026-04-24 21:29:06.931203947 +0000 UTC m=+20.802181445" Apr 24 21:29:06.931383 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:06.931313 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fkzj2" podStartSLOduration=4.010763483 podStartE2EDuration="20.931309204s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.38605727 +0000 UTC m=+3.257034748" lastFinishedPulling="2026-04-24 21:29:06.306602988 +0000 UTC m=+20.177580469" observedRunningTime="2026-04-24 21:29:06.930738098 +0000 UTC m=+20.801715597" watchObservedRunningTime="2026-04-24 21:29:06.931309204 +0000 UTC m=+20.802286702" Apr 24 21:29:07.638431 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.638408 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:29:07.644491 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.644412 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:29:07.63842603Z","UUID":"1e16c2d8-3b4c-44c0-9306-d5c10ee0f3bd","Handler":null,"Name":"","Endpoint":""} Apr 24 21:29:07.645885 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.645869 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:29:07.645973 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.645890 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:29:07.832161 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.832099 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d8cnv" event={"ID":"39fba077-f532-47c2-b634-29e01862bef6","Type":"ContainerStarted","Data":"c94c9c316b58f91a9466f5dfe28fd894130baa3e5a9f9e3dedd5cd203472f23a"} Apr 24 21:29:07.833403 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.833382 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf6a1a97-e9a0-4091-b077-931e1415d0c5" containerID="1b00b0bdc51533c76335f68977ec7cb597b86a0c135d7ccd01fb85da3d6bfd3e" exitCode=0 Apr 24 21:29:07.833484 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.833461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerDied","Data":"1b00b0bdc51533c76335f68977ec7cb597b86a0c135d7ccd01fb85da3d6bfd3e"} Apr 24 21:29:07.835969 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.835888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:29:07.836266 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.836246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"137d1f9f685ce5015c4ae112a4c50d678cbd31158e55f70131d19512341909d8"} Apr 24 21:29:07.836326 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.836270 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"a9ac0928306e1c025665d41aa3e156558c5fe8f368235adf6bc5bf503e7f59a3"} Apr 24 21:29:07.836326 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.836281 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" 
event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"73464d94a7c2901bb3fd9d22cda7d9f5a076709c95e8477dfcc5ab3f6f98be8f"} Apr 24 21:29:07.836326 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.836289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"9ff374218800e09e4b6cf47eca7a31e4124ca1ab2113b01f7b5bba6ef7fa2dc6"} Apr 24 21:29:07.837812 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.837789 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" event={"ID":"244284fa-4acf-45db-bf3a-c7bcd19a6b80","Type":"ContainerStarted","Data":"14953b50b6058f8312a2a351fd2d7217deada8dfca69d1a278587c1b214cf438"} Apr 24 21:29:07.848788 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.848754 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cdrzz" podStartSLOduration=4.924099829 podStartE2EDuration="21.848744643s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.381943066 +0000 UTC m=+3.252920543" lastFinishedPulling="2026-04-24 21:29:06.306587867 +0000 UTC m=+20.177565357" observedRunningTime="2026-04-24 21:29:06.952294928 +0000 UTC m=+20.823272438" watchObservedRunningTime="2026-04-24 21:29:07.848744643 +0000 UTC m=+21.719722145" Apr 24 21:29:07.849034 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:07.849009 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-d8cnv" podStartSLOduration=4.898226981 podStartE2EDuration="21.849002576s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.355816856 +0000 UTC m=+3.226794336" lastFinishedPulling="2026-04-24 21:29:06.306592445 +0000 UTC m=+20.177569931" 
observedRunningTime="2026-04-24 21:29:07.848733704 +0000 UTC m=+21.719711200" watchObservedRunningTime="2026-04-24 21:29:07.849002576 +0000 UTC m=+21.719980096" Apr 24 21:29:08.711810 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:08.711566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:08.712297 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:08.711920 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374" Apr 24 21:29:08.712297 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:08.711570 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:08.712297 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:08.711568 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:29:08.712297 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:08.712017 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb" Apr 24 21:29:08.712297 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:08.712118 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0" Apr 24 21:29:08.985502 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:08.985435 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q8sjc" Apr 24 21:29:09.000614 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:08.999905 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q8sjc" Apr 24 21:29:09.843437 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:09.843390 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" event={"ID":"244284fa-4acf-45db-bf3a-c7bcd19a6b80","Type":"ContainerStarted","Data":"d0e27d6ef189a4945326261032112576811cb5b8753af2d88c642230f7055d16"} Apr 24 21:29:09.846406 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:09.846374 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:29:09.846742 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:09.846720 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"dbf907b99fe0fc2ea7cd4079dc0f42d2227abd18730acb64c22ec8960e0734fc"} Apr 24 21:29:09.862630 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:29:09.862576 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j2fts" podStartSLOduration=4.557074093 podStartE2EDuration="23.862563775s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.386128215 +0000 UTC m=+3.257105696" lastFinishedPulling="2026-04-24 21:29:08.691617886 +0000 UTC m=+22.562595378" observedRunningTime="2026-04-24 21:29:09.862050633 +0000 UTC m=+23.733028143" watchObservedRunningTime="2026-04-24 21:29:09.862563775 +0000 UTC m=+23.733541253" Apr 24 21:29:10.711456 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:10.711407 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:10.711456 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:10.711424 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:29:10.711680 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:10.711520 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374" Apr 24 21:29:10.711680 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:10.711567 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0" Apr 24 21:29:10.711680 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:10.711609 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:10.711811 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:10.711680 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb" Apr 24 21:29:12.711702 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.711517 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:12.712198 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.711516 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:12.712198 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:12.711778 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374" Apr 24 21:29:12.712198 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.711522 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:29:12.712198 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:12.711855 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb" Apr 24 21:29:12.712198 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:12.711922 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0" Apr 24 21:29:12.854934 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.854907 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:29:12.855293 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.855259 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"d8ed5dce770be5cd8153c715592dffc9ac759a7bc1ed9af21d841189ea0dfa0a"} Apr 24 21:29:12.855592 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.855568 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" Apr 24 21:29:12.855779 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.855752 2574 scope.go:117] "RemoveContainer" 
containerID="02f18dfc3dfd47fe5dba0053ffae9c034b1cec31021a5899fa8317bccf560f2c"
Apr 24 21:29:12.857156 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.857131 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf6a1a97-e9a0-4091-b077-931e1415d0c5" containerID="cd48f927be24e2afd997365849f643082e5a553a8b83866cf0d9569507d4fb61" exitCode=0
Apr 24 21:29:12.857266 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.857165 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerDied","Data":"cd48f927be24e2afd997365849f643082e5a553a8b83866cf0d9569507d4fb61"}
Apr 24 21:29:12.869071 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.869039 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q8sjc"
Apr 24 21:29:12.869175 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.869156 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:29:12.869578 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.869557 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q8sjc"
Apr 24 21:29:12.872158 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:12.872137 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:29:13.853456 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.853266 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vtshd"]
Apr 24 21:29:13.853828 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.853558 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:13.853828 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:13.853658 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:29:13.857284 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.857262 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4xgsz"]
Apr 24 21:29:13.857401 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.857355 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:13.857459 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:13.857430 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb"
Apr 24 21:29:13.858556 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.858285 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tdnnb"]
Apr 24 21:29:13.858556 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.858417 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:13.858556 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:13.858526 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:29:13.866089 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.864932 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log"
Apr 24 21:29:13.866434 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.866395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" event={"ID":"3e1b294d-b645-40e3-b659-41031123c7f2","Type":"ContainerStarted","Data":"4fe7c68cb92c4e07b28a0074bffe7371cca8ed2316556a3fed2c4a25cd812c0d"}
Apr 24 21:29:13.867082 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.867060 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:29:13.867366 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.867296 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:29:13.885813 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.885751 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:29:13.893141 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:13.892860 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-256cw" podStartSLOduration=10.872568481 podStartE2EDuration="27.892849788s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.362377219 +0000 UTC m=+3.233354695" lastFinishedPulling="2026-04-24 21:29:06.382658509 +0000 UTC m=+20.253636002" observedRunningTime="2026-04-24 21:29:13.892296836 +0000 UTC m=+27.763274334" watchObservedRunningTime="2026-04-24 21:29:13.892849788 +0000 UTC m=+27.763827279"
Apr 24 21:29:14.496669 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:14.496640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:14.496845 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:14.496758 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:29:14.496845 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:14.496807 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret podName:f1b28df9-d260-40f6-ba3b-63772a458eeb nodeName:}" failed. No retries permitted until 2026-04-24 21:29:30.496792055 +0000 UTC m=+44.367769532 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret") pod "global-pull-secret-syncer-4xgsz" (UID: "f1b28df9-d260-40f6-ba3b-63772a458eeb") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:29:14.869912 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:14.869829 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf6a1a97-e9a0-4091-b077-931e1415d0c5" containerID="bb99d79854450dfd6683971ec350dd44364da162b9eb711d25053be3951011da" exitCode=0
Apr 24 21:29:14.870273 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:14.869926 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerDied","Data":"bb99d79854450dfd6683971ec350dd44364da162b9eb711d25053be3951011da"}
Apr 24 21:29:14.870273 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:14.870140 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:29:15.712092 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:15.712033 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:15.712208 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:15.712043 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:15.712208 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:15.712133 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb"
Apr 24 21:29:15.712208 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:15.712043 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:15.712343 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:15.712193 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:29:15.712343 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:15.712289 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:29:15.872113 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:15.872091 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:29:16.875592 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:16.875558 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf6a1a97-e9a0-4091-b077-931e1415d0c5" containerID="871a367826e38fb81764f05ee8d0b629cc832d920baa43d330e41882fd8c2fbd" exitCode=0
Apr 24 21:29:16.876104 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:16.875612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerDied","Data":"871a367826e38fb81764f05ee8d0b629cc832d920baa43d330e41882fd8c2fbd"}
Apr 24 21:29:17.470085 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:17.469996 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:29:17.470255 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:17.470241 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:29:17.487864 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:17.487840 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-256cw"
Apr 24 21:29:17.711964 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:17.711933 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:17.712122 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:17.711933 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:17.712122 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:17.712049 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4xgsz" podUID="f1b28df9-d260-40f6-ba3b-63772a458eeb"
Apr 24 21:29:17.712122 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:17.711933 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:17.712278 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:17.712122 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vtshd" podUID="ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374"
Apr 24 21:29:17.712278 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:17.712192 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:29:19.424394 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.424327 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-162.ec2.internal" event="NodeReady"
Apr 24 21:29:19.424795 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.424443 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:29:19.461552 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.461525 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-65859fb559-pxvcf"]
Apr 24 21:29:19.482699 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.482674 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jtvxc"]
Apr 24 21:29:19.483004 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.482969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.485444 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.485423 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:29:19.485557 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.485455 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:29:19.485621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.485568 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:29:19.485668 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.485622 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ptjx6\""
Apr 24 21:29:19.490408 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.490246 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:29:19.499933 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.499914 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r8jwq"]
Apr 24 21:29:19.500068 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.500049 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.502141 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.502118 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sbjvw\""
Apr 24 21:29:19.502533 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.502347 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:29:19.502533 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.502364 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:29:19.512417 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.512349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65859fb559-pxvcf"]
Apr 24 21:29:19.512417 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.512371 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jtvxc"]
Apr 24 21:29:19.512417 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.512385 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r8jwq"]
Apr 24 21:29:19.512618 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.512481 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:19.515444 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.515363 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 21:29:19.515763 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.515580 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 21:29:19.515763 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.515629 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 21:29:19.515928 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.515833 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kxf55\""
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633097 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-ca-trust-extracted\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633240 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-image-registry-private-configuration\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-installation-pull-secrets\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3854f8f7-804b-4511-a3f3-1b96449f8b70-config-volume\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-certificates\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qllcs\" (UniqueName: \"kubernetes.io/projected/3854f8f7-804b-4511-a3f3-1b96449f8b70-kube-api-access-qllcs\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:19.633586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bqjs\" (UniqueName: \"kubernetes.io/projected/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-kube-api-access-9bqjs\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:19.634133 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633637 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4llr\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-kube-api-access-n4llr\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.634133 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.634133 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-trusted-ca\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.634133 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633743 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-bound-sa-token\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.634133 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.633768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3854f8f7-804b-4511-a3f3-1b96449f8b70-tmp-dir\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.711792 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.711707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz"
Apr 24 21:29:19.711921 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.711707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:19.711921 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.711707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:19.714277 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.714255 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:29:19.714507 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.714480 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:29:19.714507 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.714501 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6sxw\""
Apr 24 21:29:19.714662 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.714543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:29:19.714662 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.714546 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-flqk8\""
Apr 24 21:29:19.714662 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.714587 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:29:19.734152 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-installation-pull-secrets\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.734247 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734163 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3854f8f7-804b-4511-a3f3-1b96449f8b70-config-volume\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.734247 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-certificates\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.734397 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qllcs\" (UniqueName: \"kubernetes.io/projected/3854f8f7-804b-4511-a3f3-1b96449f8b70-kube-api-access-qllcs\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.734465 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:19.734465 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bqjs\" (UniqueName: \"kubernetes.io/projected/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-kube-api-access-9bqjs\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:19.734568 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734467 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4llr\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-kube-api-access-n4llr\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.734568 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.734568 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-trusted-ca\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.734568 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:19.734553 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:19.734765 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:19.734643 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:20.234613219 +0000 UTC m=+34.105590704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found
Apr 24 21:29:19.734765 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-bound-sa-token\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.734765 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734681 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3854f8f7-804b-4511-a3f3-1b96449f8b70-tmp-dir\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.734765 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.734765 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734741 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-ca-trust-extracted\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.735013 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-image-registry-private-configuration\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.735013 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3854f8f7-804b-4511-a3f3-1b96449f8b70-config-volume\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.735013 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.734802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-certificates\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.735013 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:19.734974 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:29:19.735196 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:19.735032 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:20.235014853 +0000 UTC m=+34.105992332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found
Apr 24 21:29:19.735196 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:19.735084 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:19.735196 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:19.735095 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret "image-registry-tls" not found
Apr 24 21:29:19.735196 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:19.735129 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:20.23511693 +0000 UTC m=+34.106094409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found
Apr 24 21:29:19.735415 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.735219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3854f8f7-804b-4511-a3f3-1b96449f8b70-tmp-dir\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.735415 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.735272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-ca-trust-extracted\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.735734 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.735714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-trusted-ca\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.738710 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.738691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-installation-pull-secrets\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.738807 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.738784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-image-registry-private-configuration\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.752070 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.752041 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-bound-sa-token\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:19.752209 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.752114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qllcs\" (UniqueName: \"kubernetes.io/projected/3854f8f7-804b-4511-a3f3-1b96449f8b70-kube-api-access-qllcs\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:19.752209 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.752187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bqjs\" (UniqueName: \"kubernetes.io/projected/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-kube-api-access-9bqjs\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:19.752678 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:19.752661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4llr\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-kube-api-access-n4llr\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
pod="openshift-image-registry/image-registry-65859fb559-pxvcf" Apr 24 21:29:20.239102 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:20.239068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq" Apr 24 21:29:20.239296 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:20.239126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc" Apr 24 21:29:20.239296 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:20.239183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf" Apr 24 21:29:20.239296 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.239256 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:20.239463 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.239304 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:20.239463 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.239318 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret "image-registry-tls" not found Apr 24 21:29:20.239463 ip-10-0-142-162 kubenswrapper[2574]: E0424 
21:29:20.239323 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:21.239305603 +0000 UTC m=+35.110283100 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found Apr 24 21:29:20.239463 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.239359 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:21.239348236 +0000 UTC m=+35.110325713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found Apr 24 21:29:20.239463 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.239410 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:20.239463 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.239456 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:21.239443063 +0000 UTC m=+35.110420540 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found Apr 24 21:29:20.339919 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:20.339858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:29:20.340086 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.340036 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:29:20.340153 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:20.340109 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:52.340088573 +0000 UTC m=+66.211066058 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : secret "metrics-daemon-secret" not found Apr 24 21:29:20.440398 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:20.440354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:20.442995 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:20.442973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j2qz\" (UniqueName: \"kubernetes.io/projected/ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374-kube-api-access-2j2qz\") pod \"network-check-target-vtshd\" (UID: \"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374\") " pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:20.629054 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:20.629023 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:21.246566 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:21.246532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf" Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:21.246599 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq" Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:21.246642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc" Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:21.246691 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:21.246704 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:21.246722 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:21.246765 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:23.246751597 +0000 UTC m=+37.117729077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:21.246711 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret "image-registry-tls" not found Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:21.246777 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:23.246771354 +0000 UTC m=+37.117748831 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found Apr 24 21:29:21.246852 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:21.246810 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:23.246791498 +0000 UTC m=+37.117768976 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found Apr 24 21:29:22.413521 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:22.413494 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vtshd"] Apr 24 21:29:22.498210 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:29:22.498179 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab4b7ffb_2c75_4a6a_b8f4_287b1b3bb374.slice/crio-fc7a79c43cd563688753bd03f4d0f419f8b96dc00b5c6e76eb99c3f79574bbbc WatchSource:0}: Error finding container fc7a79c43cd563688753bd03f4d0f419f8b96dc00b5c6e76eb99c3f79574bbbc: Status 404 returned error can't find the container with id fc7a79c43cd563688753bd03f4d0f419f8b96dc00b5c6e76eb99c3f79574bbbc Apr 24 21:29:22.889767 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:22.889472 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf6a1a97-e9a0-4091-b077-931e1415d0c5" containerID="59f1ee43c2b429865a595b4c1acf10f91e59d29a4ba90c1a3568322bae8c07ed" exitCode=0 Apr 24 21:29:22.889767 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:22.889559 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerDied","Data":"59f1ee43c2b429865a595b4c1acf10f91e59d29a4ba90c1a3568322bae8c07ed"} Apr 24 21:29:22.890930 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:22.890906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vtshd" 
event={"ID":"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374","Type":"ContainerStarted","Data":"fc7a79c43cd563688753bd03f4d0f419f8b96dc00b5c6e76eb99c3f79574bbbc"} Apr 24 21:29:23.264784 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:23.264736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq" Apr 24 21:29:23.264935 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:23.264798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc" Apr 24 21:29:23.264935 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:23.264837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf" Apr 24 21:29:23.264935 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:23.264895 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:23.265150 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:23.264953 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:23.265150 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:23.264967 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret 
"image-registry-tls" not found Apr 24 21:29:23.265150 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:23.264975 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:27.264953145 +0000 UTC m=+41.135930636 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found Apr 24 21:29:23.265150 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:23.265014 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:27.264998928 +0000 UTC m=+41.135976406 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found Apr 24 21:29:23.265150 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:23.264953 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:23.265150 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:23.265055 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:27.265045614 +0000 UTC m=+41.136023092 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found Apr 24 21:29:23.896061 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:23.896024 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf6a1a97-e9a0-4091-b077-931e1415d0c5" containerID="74c20a9d7b5217930c9b9a98ead1797b6e12e72402eb3d3de4b501913c43a63d" exitCode=0 Apr 24 21:29:23.896519 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:23.896079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerDied","Data":"74c20a9d7b5217930c9b9a98ead1797b6e12e72402eb3d3de4b501913c43a63d"} Apr 24 21:29:24.901846 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:24.901816 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" event={"ID":"bf6a1a97-e9a0-4091-b077-931e1415d0c5","Type":"ContainerStarted","Data":"4737c1e768b49f59b0738d0b4fcb3a7b12c70d0b44985700b86e8cc06a4b7b28"} Apr 24 21:29:24.925316 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:24.925265 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wn7sd" podStartSLOduration=5.77874061 podStartE2EDuration="38.925249117s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.381943489 +0000 UTC m=+3.252920983" lastFinishedPulling="2026-04-24 21:29:22.528452014 +0000 UTC m=+36.399429490" observedRunningTime="2026-04-24 21:29:24.923536441 +0000 UTC m=+38.794513940" watchObservedRunningTime="2026-04-24 21:29:24.925249117 +0000 UTC m=+38.796226620" Apr 24 21:29:25.905510 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:25.905477 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-diagnostics/network-check-target-vtshd" event={"ID":"ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374","Type":"ContainerStarted","Data":"07e4568796b2b3064a3678d459d8f210d69f4ba0e974915d32abcbd59cc409ba"} Apr 24 21:29:25.905868 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:25.905832 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vtshd" Apr 24 21:29:25.921649 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:25.921612 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vtshd" podStartSLOduration=37.192524031 podStartE2EDuration="39.921600992s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:29:22.505551015 +0000 UTC m=+36.376528491" lastFinishedPulling="2026-04-24 21:29:25.23462796 +0000 UTC m=+39.105605452" observedRunningTime="2026-04-24 21:29:25.920742758 +0000 UTC m=+39.791720268" watchObservedRunningTime="2026-04-24 21:29:25.921600992 +0000 UTC m=+39.792578488" Apr 24 21:29:27.297835 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:27.297801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq" Apr 24 21:29:27.297835 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:27.297843 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc" Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:27.297867 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf" Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:27.297972 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:27.298036 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:35.29801949 +0000 UTC m=+49.168996972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:27.297974 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:27.298079 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret "image-registry-tls" not found Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:27.297973 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:27.298146 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:35.298124047 +0000 UTC m=+49.169101529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found Apr 24 21:29:27.298385 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:27.298201 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:35.298182008 +0000 UTC m=+49.169159493 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found Apr 24 21:29:30.519099 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:30.519044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:30.522827 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:30.522797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b28df9-d260-40f6-ba3b-63772a458eeb-original-pull-secret\") pod \"global-pull-secret-syncer-4xgsz\" (UID: \"f1b28df9-d260-40f6-ba3b-63772a458eeb\") " 
pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:30.822568 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:30.822487 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4xgsz" Apr 24 21:29:30.939791 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:30.939760 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4xgsz"] Apr 24 21:29:30.943110 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:29:30.943082 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1b28df9_d260_40f6_ba3b_63772a458eeb.slice/crio-2069d335251584fa3f9b782ad731aaef95e3c40f3c1ea5c62ba0e270a2e00e5f WatchSource:0}: Error finding container 2069d335251584fa3f9b782ad731aaef95e3c40f3c1ea5c62ba0e270a2e00e5f: Status 404 returned error can't find the container with id 2069d335251584fa3f9b782ad731aaef95e3c40f3c1ea5c62ba0e270a2e00e5f Apr 24 21:29:31.917176 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:31.917137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4xgsz" event={"ID":"f1b28df9-d260-40f6-ba3b-63772a458eeb","Type":"ContainerStarted","Data":"2069d335251584fa3f9b782ad731aaef95e3c40f3c1ea5c62ba0e270a2e00e5f"} Apr 24 21:29:34.924533 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:34.924496 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4xgsz" event={"ID":"f1b28df9-d260-40f6-ba3b-63772a458eeb","Type":"ContainerStarted","Data":"462045011b63a1a3a6d924722c7159df3bbb13294338e05b3dcb10e11890ffc9"} Apr 24 21:29:34.940270 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:34.940208 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4xgsz" podStartSLOduration=33.508692787 podStartE2EDuration="36.940196225s" podCreationTimestamp="2026-04-24 21:28:58 
+0000 UTC" firstStartedPulling="2026-04-24 21:29:30.945046505 +0000 UTC m=+44.816023981" lastFinishedPulling="2026-04-24 21:29:34.376549939 +0000 UTC m=+48.247527419" observedRunningTime="2026-04-24 21:29:34.93955633 +0000 UTC m=+48.810533974" watchObservedRunningTime="2026-04-24 21:29:34.940196225 +0000 UTC m=+48.811173724"
Apr 24 21:29:35.354646 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:35.354564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:35.354646 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:35.354630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:35.354844 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:35.354669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:35.354844 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:35.354723 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:35.354844 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:35.354785 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:29:35.354844 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:35.354798 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:51.354778453 +0000 UTC m=+65.225755930 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found
Apr 24 21:29:35.354844 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:35.354789 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:35.355088 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:35.354861 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret "image-registry-tls" not found
Apr 24 21:29:35.355088 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:35.354848 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:51.354833211 +0000 UTC m=+65.225810704 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found
Apr 24 21:29:35.355088 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:35.354915 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:51.354902572 +0000 UTC m=+65.225880051 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found
Apr 24 21:29:51.362555 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:51.362517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:51.362580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:51.362674 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:51.362676 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:51.362697 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret "image-registry-tls" not found
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:51.362722 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:23.362706885 +0000 UTC m=+97.233684361 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:51.362752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:51.362764 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:30:23.362745476 +0000 UTC m=+97.233722955 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:51.362829 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:29:51.363041 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:51.362861 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:23.362853422 +0000 UTC m=+97.233830899 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found
Apr 24 21:29:52.368363 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:52.368327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:29:52.368767 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:52.368490 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:29:52.368767 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:29:52.368571 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:56.368554349 +0000 UTC m=+130.239531825 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : secret "metrics-daemon-secret" not found
Apr 24 21:29:56.399996 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.399958 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"]
Apr 24 21:29:56.402843 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.402827 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.405014 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.404994 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 21:29:56.405104 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.405088 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 21:29:56.405886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.405863 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:29:56.405886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.405872 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 21:29:56.406026 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.405875 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:29:56.406026 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.405942 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:29:56.406026 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.405984 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 21:29:56.415576 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.415556 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"]
Apr 24 21:29:56.494864 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.494840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-ca\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.494964 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.494880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.494964 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.494948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwgsj\" (UniqueName: \"kubernetes.io/projected/9cd8aae5-3752-411b-9b22-2c9688bb5914-kube-api-access-hwgsj\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.495033 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.494982 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-hub\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.495033 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.495014 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.495135 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.495036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9cd8aae5-3752-411b-9b22-2c9688bb5914-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.595989 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.595967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.596057 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.595992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9cd8aae5-3752-411b-9b22-2c9688bb5914-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.596057 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.596028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-ca\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.596124 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.596073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.596124 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.596100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwgsj\" (UniqueName: \"kubernetes.io/projected/9cd8aae5-3752-411b-9b22-2c9688bb5914-kube-api-access-hwgsj\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.596211 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.596135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-hub\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.596723 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.596703 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9cd8aae5-3752-411b-9b22-2c9688bb5914-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.598450 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.598429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.598766 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.598749 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-ca\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.598889 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.598866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-hub\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.599136 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.599117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9cd8aae5-3752-411b-9b22-2c9688bb5914-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.609107 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.609088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwgsj\" (UniqueName: \"kubernetes.io/projected/9cd8aae5-3752-411b-9b22-2c9688bb5914-kube-api-access-hwgsj\") pod \"cluster-proxy-proxy-agent-7c9b5b999d-9nl7l\" (UID: \"9cd8aae5-3752-411b-9b22-2c9688bb5914\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.722431 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.722380 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:29:56.838437 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.838410 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"]
Apr 24 21:29:56.841893 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:29:56.841865 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd8aae5_3752_411b_9b22_2c9688bb5914.slice/crio-8e7c142da099a82784866461aa6278dd26693be9483f40d11b097e61a600772b WatchSource:0}: Error finding container 8e7c142da099a82784866461aa6278dd26693be9483f40d11b097e61a600772b: Status 404 returned error can't find the container with id 8e7c142da099a82784866461aa6278dd26693be9483f40d11b097e61a600772b
Apr 24 21:29:56.967106 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:56.967076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" event={"ID":"9cd8aae5-3752-411b-9b22-2c9688bb5914","Type":"ContainerStarted","Data":"8e7c142da099a82784866461aa6278dd26693be9483f40d11b097e61a600772b"}
Apr 24 21:29:57.911955 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:57.911923 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vtshd"
Apr 24 21:29:59.974140 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:29:59.974075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" event={"ID":"9cd8aae5-3752-411b-9b22-2c9688bb5914","Type":"ContainerStarted","Data":"7411b7061de169ad368d8a45c3ae939c1fe56c76f0ddff8fd5300cc308b19f39"}
Apr 24 21:30:01.979162 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:30:01.979128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" event={"ID":"9cd8aae5-3752-411b-9b22-2c9688bb5914","Type":"ContainerStarted","Data":"96b95c57021f3706fac1e3852e1fb3e3729e72a43ed41762367be8e26fd63399"}
Apr 24 21:30:01.979510 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:30:01.979171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" event={"ID":"9cd8aae5-3752-411b-9b22-2c9688bb5914","Type":"ContainerStarted","Data":"0f7c13b421c569333d9e77939e72307bc605e69173294a1871a2a115667fe0e8"}
Apr 24 21:30:01.998515 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:30:01.998470 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" podStartSLOduration=1.418083327 podStartE2EDuration="5.998455712s" podCreationTimestamp="2026-04-24 21:29:56 +0000 UTC" firstStartedPulling="2026-04-24 21:29:56.843568213 +0000 UTC m=+70.714545693" lastFinishedPulling="2026-04-24 21:30:01.423940582 +0000 UTC m=+75.294918078" observedRunningTime="2026-04-24 21:30:01.997070097 +0000 UTC m=+75.868047596" watchObservedRunningTime="2026-04-24 21:30:01.998455712 +0000 UTC m=+75.869433432"
Apr 24 21:30:23.373688 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:30:23.373644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:30:23.373713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:30:23.373758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:23.373819 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:23.373829 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:23.373861 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65859fb559-pxvcf: secret "image-registry-tls" not found
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:23.373892 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert podName:0ee47bd0-7e68-48fd-8896-e4693d5e8f21 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:27.373874432 +0000 UTC m=+161.244851915 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert") pod "ingress-canary-r8jwq" (UID: "0ee47bd0-7e68-48fd-8896-e4693d5e8f21") : secret "canary-serving-cert" not found
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:23.373837 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:23.373935 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls podName:263f8ce6-7a53-4ba0-808d-ac71652fdc4d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:27.373913932 +0000 UTC m=+161.244891422 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls") pod "image-registry-65859fb559-pxvcf" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d") : secret "image-registry-tls" not found
Apr 24 21:30:23.374251 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:23.373989 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls podName:3854f8f7-804b-4511-a3f3-1b96449f8b70 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:27.373977876 +0000 UTC m=+161.244955353 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls") pod "dns-default-jtvxc" (UID: "3854f8f7-804b-4511-a3f3-1b96449f8b70") : secret "dns-default-metrics-tls" not found
Apr 24 21:30:56.393778 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:30:56.393728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:30:56.394304 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:56.393869 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:30:56.394304 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:30:56.393939 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs podName:fba6f53a-a544-4d53-ba11-2dd3b3259ed0 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:58.393923904 +0000 UTC m=+252.264901381 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs") pod "network-metrics-daemon-tdnnb" (UID: "fba6f53a-a544-4d53-ba11-2dd3b3259ed0") : secret "metrics-daemon-secret" not found
Apr 24 21:31:21.039695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:21.039666 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cwvxk_da82016d-3774-4430-881a-6479d2a7aa8c/dns-node-resolver/0.log"
Apr 24 21:31:22.041623 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:22.041591 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cdrzz_2fc5cade-b0b3-414a-88b0-ae3c0348001f/node-ca/0.log"
Apr 24 21:31:22.503991 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:22.503947 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" podUID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d"
Apr 24 21:31:22.510101 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:22.510074 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jtvxc" podUID="3854f8f7-804b-4511-a3f3-1b96449f8b70"
Apr 24 21:31:22.522377 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:22.522349 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-r8jwq" podUID="0ee47bd0-7e68-48fd-8896-e4693d5e8f21"
Apr 24 21:31:22.735266 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:22.735235 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tdnnb" podUID="fba6f53a-a544-4d53-ba11-2dd3b3259ed0"
Apr 24 21:31:23.134327 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:23.134297 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:31:23.134672 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:23.134297 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:31:27.401665 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.401572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:31:27.401665 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.401632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:31:27.401665 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.401658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:31:27.403888 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.403864 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3854f8f7-804b-4511-a3f3-1b96449f8b70-metrics-tls\") pod \"dns-default-jtvxc\" (UID: \"3854f8f7-804b-4511-a3f3-1b96449f8b70\") " pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:31:27.404069 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.404047 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"image-registry-65859fb559-pxvcf\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") " pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:31:27.404129 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.404076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee47bd0-7e68-48fd-8896-e4693d5e8f21-cert\") pod \"ingress-canary-r8jwq\" (UID: \"0ee47bd0-7e68-48fd-8896-e4693d5e8f21\") " pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:31:27.638084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.638053 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ptjx6\""
Apr 24 21:31:27.638084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.638055 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sbjvw\""
Apr 24 21:31:27.645306 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.645287 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:31:27.645414 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.645372 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:31:27.768956 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.768919 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jtvxc"]
Apr 24 21:31:27.772066 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:31:27.772038 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3854f8f7_804b_4511_a3f3_1b96449f8b70.slice/crio-49bccfdfae8d9b9c6d0953da4b5232cda7b4a03a5b844bfee5180582449e85e4 WatchSource:0}: Error finding container 49bccfdfae8d9b9c6d0953da4b5232cda7b4a03a5b844bfee5180582449e85e4: Status 404 returned error can't find the container with id 49bccfdfae8d9b9c6d0953da4b5232cda7b4a03a5b844bfee5180582449e85e4
Apr 24 21:31:27.785443 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:27.785417 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65859fb559-pxvcf"]
Apr 24 21:31:27.788150 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:31:27.788130 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod263f8ce6_7a53_4ba0_808d_ac71652fdc4d.slice/crio-fb5013c9b2aa9f942fae378054f9c2ebda38c2975e76ace116192c0e870ead54 WatchSource:0}: Error finding container fb5013c9b2aa9f942fae378054f9c2ebda38c2975e76ace116192c0e870ead54: Status 404 returned error can't find the container with id fb5013c9b2aa9f942fae378054f9c2ebda38c2975e76ace116192c0e870ead54
Apr 24 21:31:28.146392 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:28.146352 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" event={"ID":"263f8ce6-7a53-4ba0-808d-ac71652fdc4d","Type":"ContainerStarted","Data":"d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9"}
Apr 24 21:31:28.146392 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:28.146390 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" event={"ID":"263f8ce6-7a53-4ba0-808d-ac71652fdc4d","Type":"ContainerStarted","Data":"fb5013c9b2aa9f942fae378054f9c2ebda38c2975e76ace116192c0e870ead54"}
Apr 24 21:31:28.146592 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:28.146487 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:31:28.147331 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:28.147309 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jtvxc" event={"ID":"3854f8f7-804b-4511-a3f3-1b96449f8b70","Type":"ContainerStarted","Data":"49bccfdfae8d9b9c6d0953da4b5232cda7b4a03a5b844bfee5180582449e85e4"}
Apr 24 21:31:28.172164 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:28.172126 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" podStartSLOduration=168.172114146 podStartE2EDuration="2m48.172114146s" podCreationTimestamp="2026-04-24 21:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:28.171648177 +0000 UTC m=+162.042625679" watchObservedRunningTime="2026-04-24 21:31:28.172114146 +0000 UTC m=+162.043091644"
Apr 24 21:31:30.153242 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:30.153193 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jtvxc" event={"ID":"3854f8f7-804b-4511-a3f3-1b96449f8b70","Type":"ContainerStarted","Data":"f9f5ffae49b08aaa5d683ae0e753345b98b319ecab29f93b64993fc2947d5aa1"}
Apr 24 21:31:30.153632 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:30.153252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jtvxc" event={"ID":"3854f8f7-804b-4511-a3f3-1b96449f8b70","Type":"ContainerStarted","Data":"16bbf1a44245dc6d0a673da1cac8af40413ae302652b534d7c9ae110148f0128"}
Apr 24 21:31:30.153632 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:30.153382 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jtvxc"
Apr 24 21:31:33.712020 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:33.711985 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:31:33.716461 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:33.716444 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kxf55\""
Apr 24 21:31:33.722897 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:33.722878 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r8jwq"
Apr 24 21:31:33.842782 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:33.842739 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jtvxc" podStartSLOduration=133.540969762 podStartE2EDuration="2m14.842721316s" podCreationTimestamp="2026-04-24 21:29:19 +0000 UTC" firstStartedPulling="2026-04-24 21:31:27.773681848 +0000 UTC m=+161.644659325" lastFinishedPulling="2026-04-24 21:31:29.075433403 +0000 UTC m=+162.946410879" observedRunningTime="2026-04-24 21:31:30.179729378 +0000 UTC m=+164.050706892" watchObservedRunningTime="2026-04-24 21:31:33.842721316 +0000 UTC m=+167.713698814"
Apr 24 21:31:33.843038 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:33.843024 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r8jwq"]
Apr 24 21:31:33.845958 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:31:33.845920 2574 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee47bd0_7e68_48fd_8896_e4693d5e8f21.slice/crio-0f7c56bc48c7c2779827aff1db8e76b655ccfda04f23632c642430cc9db503a4 WatchSource:0}: Error finding container 0f7c56bc48c7c2779827aff1db8e76b655ccfda04f23632c642430cc9db503a4: Status 404 returned error can't find the container with id 0f7c56bc48c7c2779827aff1db8e76b655ccfda04f23632c642430cc9db503a4 Apr 24 21:31:34.165268 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:34.165217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r8jwq" event={"ID":"0ee47bd0-7e68-48fd-8896-e4693d5e8f21","Type":"ContainerStarted","Data":"0f7c56bc48c7c2779827aff1db8e76b655ccfda04f23632c642430cc9db503a4"} Apr 24 21:31:36.172741 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:36.172686 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r8jwq" event={"ID":"0ee47bd0-7e68-48fd-8896-e4693d5e8f21","Type":"ContainerStarted","Data":"0a011c4423592d7bb883a298f9cc592f375019b44a76ea3a560ea0b86c58bed5"} Apr 24 21:31:36.189480 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:36.189429 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r8jwq" podStartSLOduration=135.763960943 podStartE2EDuration="2m17.189415387s" podCreationTimestamp="2026-04-24 21:29:19 +0000 UTC" firstStartedPulling="2026-04-24 21:31:33.847800679 +0000 UTC m=+167.718778156" lastFinishedPulling="2026-04-24 21:31:35.273255114 +0000 UTC m=+169.144232600" observedRunningTime="2026-04-24 21:31:36.188607538 +0000 UTC m=+170.059585078" watchObservedRunningTime="2026-04-24 21:31:36.189415387 +0000 UTC m=+170.060392932" Apr 24 21:31:36.713294 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:36.713258 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb" Apr 24 21:31:40.158754 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:40.158726 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jtvxc" Apr 24 21:31:46.490922 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.490894 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lrjh7"] Apr 24 21:31:46.494446 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.494430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.498339 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.498279 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:31:46.499187 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.499162 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:31:46.499337 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.499207 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:31:46.499337 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.499260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:31:46.499337 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.499301 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r5h69\"" Apr 24 21:31:46.508520 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.508501 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lrjh7"] Apr 24 21:31:46.523724 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:31:46.523699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55b8274d-55b8-458d-848c-e205acc6cd3b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.523808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.523736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55b8274d-55b8-458d-848c-e205acc6cd3b-crio-socket\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.523808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.523754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tnk9\" (UniqueName: \"kubernetes.io/projected/55b8274d-55b8-458d-848c-e205acc6cd3b-kube-api-access-6tnk9\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.523870 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.523807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55b8274d-55b8-458d-848c-e205acc6cd3b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.523870 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.523834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/55b8274d-55b8-458d-848c-e205acc6cd3b-data-volume\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.532003 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.531984 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65859fb559-pxvcf"] Apr 24 21:31:46.536341 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.536309 2574 patch_prober.go:28] interesting pod/image-registry-65859fb559-pxvcf container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 21:31:46.536436 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.536363 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" podUID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:31:46.624435 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.624413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55b8274d-55b8-458d-848c-e205acc6cd3b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.624526 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.624440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55b8274d-55b8-458d-848c-e205acc6cd3b-crio-socket\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " 
pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.624526 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.624455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tnk9\" (UniqueName: \"kubernetes.io/projected/55b8274d-55b8-458d-848c-e205acc6cd3b-kube-api-access-6tnk9\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.624526 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.624475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55b8274d-55b8-458d-848c-e205acc6cd3b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.624626 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.624544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55b8274d-55b8-458d-848c-e205acc6cd3b-crio-socket\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.624626 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.624578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/55b8274d-55b8-458d-848c-e205acc6cd3b-data-volume\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.624879 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.624865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/55b8274d-55b8-458d-848c-e205acc6cd3b-data-volume\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.626569 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.626556 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:31:46.626642 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.626628 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:31:46.633327 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.633313 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:31:46.635116 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.635094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55b8274d-55b8-458d-848c-e205acc6cd3b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.636717 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.636702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55b8274d-55b8-458d-848c-e205acc6cd3b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.643565 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.643548 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:31:46.654566 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:31:46.654550 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tnk9\" (UniqueName: \"kubernetes.io/projected/55b8274d-55b8-458d-848c-e205acc6cd3b-kube-api-access-6tnk9\") pod \"insights-runtime-extractor-lrjh7\" (UID: \"55b8274d-55b8-458d-848c-e205acc6cd3b\") " pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.805789 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.805734 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r5h69\"" Apr 24 21:31:46.814630 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.814614 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lrjh7" Apr 24 21:31:46.928814 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:46.928783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lrjh7"] Apr 24 21:31:46.934422 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:31:46.934389 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b8274d_55b8_458d_848c_e205acc6cd3b.slice/crio-07eab7612eb0b6f32bf2d838b976760cc4789fa61938d2628515417420c1c577 WatchSource:0}: Error finding container 07eab7612eb0b6f32bf2d838b976760cc4789fa61938d2628515417420c1c577: Status 404 returned error can't find the container with id 07eab7612eb0b6f32bf2d838b976760cc4789fa61938d2628515417420c1c577 Apr 24 21:31:47.198489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:47.198461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrjh7" event={"ID":"55b8274d-55b8-458d-848c-e205acc6cd3b","Type":"ContainerStarted","Data":"c1256c45fcfaa28e3e0d281e726cf3b54c5cb2dad4fd2f0cc5268767ff812f75"} Apr 24 21:31:47.198609 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:31:47.198498 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrjh7" event={"ID":"55b8274d-55b8-458d-848c-e205acc6cd3b","Type":"ContainerStarted","Data":"07eab7612eb0b6f32bf2d838b976760cc4789fa61938d2628515417420c1c577"} Apr 24 21:31:48.203442 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:48.203410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrjh7" event={"ID":"55b8274d-55b8-458d-848c-e205acc6cd3b","Type":"ContainerStarted","Data":"9e52b2c17413e4d05cac246a89c689ef162328f95d18d3212602ca62c0f90d9c"} Apr 24 21:31:49.206773 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:49.206732 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrjh7" event={"ID":"55b8274d-55b8-458d-848c-e205acc6cd3b","Type":"ContainerStarted","Data":"3b7d4c92ddf4ad486777c1c1dbbb0e687cbbe3fd0cf4add59bb8099216484b97"} Apr 24 21:31:49.228538 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:49.228494 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lrjh7" podStartSLOduration=1.107497546 podStartE2EDuration="3.228481646s" podCreationTimestamp="2026-04-24 21:31:46 +0000 UTC" firstStartedPulling="2026-04-24 21:31:46.987814959 +0000 UTC m=+180.858792436" lastFinishedPulling="2026-04-24 21:31:49.108799058 +0000 UTC m=+182.979776536" observedRunningTime="2026-04-24 21:31:49.226976464 +0000 UTC m=+183.097953990" watchObservedRunningTime="2026-04-24 21:31:49.228481646 +0000 UTC m=+183.099459145" Apr 24 21:31:55.263858 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.263825 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk"] Apr 24 21:31:55.266907 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.266890 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.270114 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.270085 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:31:55.270114 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.270102 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:31:55.270367 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.270351 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:31:55.270426 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.270396 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-cj5pr\"" Apr 24 21:31:55.270489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.270430 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:31:55.270489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.270437 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:31:55.277524 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.277502 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk"] Apr 24 21:31:55.296344 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.296324 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rm2gn"] Apr 24 21:31:55.299105 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.299091 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.301287 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.301266 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:31:55.301287 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.301276 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zmm79\"" Apr 24 21:31:55.301480 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.301466 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:31:55.301528 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.301491 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:31:55.384094 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.384193 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384193 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:31:55.384120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47714f4e-937b-411d-b7ba-1d3c613a90ab-metrics-client-ca\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384193 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384137 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sm69\" (UniqueName: \"kubernetes.io/projected/47714f4e-937b-411d-b7ba-1d3c613a90ab-kube-api-access-9sm69\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384193 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-wtmp\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384379 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98a3c479-04bc-4dfa-850a-9146a8ebcda5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.384379 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9r54\" (UniqueName: \"kubernetes.io/projected/98a3c479-04bc-4dfa-850a-9146a8ebcda5-kube-api-access-b9r54\") 
pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.384379 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384260 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-tls\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384379 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.384379 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384333 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-root\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384379 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384356 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-textfile\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384379 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:31:55.384375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-sys\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.384637 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.384393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.484963 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.484933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47714f4e-937b-411d-b7ba-1d3c613a90ab-metrics-client-ca\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.484963 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.484966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sm69\" (UniqueName: \"kubernetes.io/projected/47714f4e-937b-411d-b7ba-1d3c613a90ab-kube-api-access-9sm69\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485155 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.484985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-wtmp\") pod \"node-exporter-rm2gn\" (UID: 
\"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485155 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485041 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98a3c479-04bc-4dfa-850a-9146a8ebcda5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.485155 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9r54\" (UniqueName: \"kubernetes.io/projected/98a3c479-04bc-4dfa-850a-9146a8ebcda5-kube-api-access-b9r54\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.485155 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-wtmp\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485155 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-tls\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485432 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.485432 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-root\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485432 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-textfile\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485577 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-root\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485577 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-sys\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485577 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485519 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485577 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.485770 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485770 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485595 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/47714f4e-937b-411d-b7ba-1d3c613a90ab-sys\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485770 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:55.485699 2574 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 21:31:55.485770 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:55.485769 2574 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-tls podName:98a3c479-04bc-4dfa-850a-9146a8ebcda5 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:55.985748341 +0000 UTC m=+189.856725822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-dd2lk" (UID: "98a3c479-04bc-4dfa-850a-9146a8ebcda5") : secret "openshift-state-metrics-tls" not found Apr 24 21:31:55.485963 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47714f4e-937b-411d-b7ba-1d3c613a90ab-metrics-client-ca\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.485963 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.485862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98a3c479-04bc-4dfa-850a-9146a8ebcda5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.486053 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.486042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.486296 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.486276 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-textfile\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.487536 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.487518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-tls\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.487671 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.487649 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.487790 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.487773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47714f4e-937b-411d-b7ba-1d3c613a90ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rm2gn\" (UID: \"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.494165 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.494141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sm69\" (UniqueName: \"kubernetes.io/projected/47714f4e-937b-411d-b7ba-1d3c613a90ab-kube-api-access-9sm69\") pod \"node-exporter-rm2gn\" (UID: 
\"47714f4e-937b-411d-b7ba-1d3c613a90ab\") " pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.494926 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.494902 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9r54\" (UniqueName: \"kubernetes.io/projected/98a3c479-04bc-4dfa-850a-9146a8ebcda5-kube-api-access-b9r54\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.607618 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.607545 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rm2gn" Apr 24 21:31:55.616378 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:31:55.616352 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47714f4e_937b_411d_b7ba_1d3c613a90ab.slice/crio-223580b2c1d3d69368331c11087a36c55071d7edc98476b2495ebafdeb14a37b WatchSource:0}: Error finding container 223580b2c1d3d69368331c11087a36c55071d7edc98476b2495ebafdeb14a37b: Status 404 returned error can't find the container with id 223580b2c1d3d69368331c11087a36c55071d7edc98476b2495ebafdeb14a37b Apr 24 21:31:55.989902 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.989869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:55.992238 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:55.992206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/98a3c479-04bc-4dfa-850a-9146a8ebcda5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dd2lk\" (UID: \"98a3c479-04bc-4dfa-850a-9146a8ebcda5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:56.175940 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.175911 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" Apr 24 21:31:56.225949 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.225911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rm2gn" event={"ID":"47714f4e-937b-411d-b7ba-1d3c613a90ab","Type":"ContainerStarted","Data":"223580b2c1d3d69368331c11087a36c55071d7edc98476b2495ebafdeb14a37b"} Apr 24 21:31:56.369905 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.369881 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:56.374896 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.374810 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.377840 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377399 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:31:56.377840 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377615 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:31:56.377840 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377755 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:31:56.377840 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377758 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:31:56.378086 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377863 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:31:56.378086 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377867 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:31:56.378086 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377937 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:31:56.378086 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.377967 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7pqz5\"" Apr 24 21:31:56.378086 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.378013 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:31:56.384054 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.384031 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:31:56.387558 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.387508 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:56.417271 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.417245 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk"] Apr 24 21:31:56.420372 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:31:56.420349 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98a3c479_04bc_4dfa_850a_9146a8ebcda5.slice/crio-d40f007e8898a077a362f3dde69b152b2b6c3a654ac60aa1781fb60e6b3209a1 WatchSource:0}: Error finding container d40f007e8898a077a362f3dde69b152b2b6c3a654ac60aa1781fb60e6b3209a1: Status 404 returned error can't find the container with id d40f007e8898a077a362f3dde69b152b2b6c3a654ac60aa1781fb60e6b3209a1 Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493746 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493878 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-config-out\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 
21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.493985 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-web-config\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.494042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxgss\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-kube-api-access-xxgss\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.494077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-config-volume\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.494103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.494127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.494262 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.494151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.536662 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.536643 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" Apr 24 21:31:56.594976 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.594919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.594976 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.594950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.594976 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.594969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.594976 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.594985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-config-out\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595274 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595274 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-web-config\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595274 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xxgss\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-kube-api-access-xxgss\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595431 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-config-volume\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595431 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:56.595312 2574 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 24 21:31:56.595431 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595431 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:31:56.595365 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls podName:637b37cc-d73f-40aa-a2ce-c867a036839e nodeName:}" failed. No retries permitted until 2026-04-24 21:31:57.095345622 +0000 UTC m=+190.966323111 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e") : secret "alertmanager-main-tls" not found Apr 24 21:31:56.595431 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595431 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.595722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595484 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 24 21:31:56.595722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.595713 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.596385 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.596331 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.596481 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.596310 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.597723 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.597699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-config-out\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.598033 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.597991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 24 21:31:56.598121 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.598090 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.598840 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.598815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-web-config\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.598997 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.598971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.599086 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.599012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-config-volume\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.599378 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.599356 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.600065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.600043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:56.605399 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:56.605379 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxgss\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-kube-api-access-xxgss\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:57.099426 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.099384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:57.102045 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.102015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:57.230009 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.229975 2574 generic.go:358] "Generic (PLEG): container finished" podID="47714f4e-937b-411d-b7ba-1d3c613a90ab" 
containerID="b593a70afcfb7195204d9cf15b2c19036a94d89ed2fda33bbd52596c4968726a" exitCode=0 Apr 24 21:31:57.230181 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.230065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rm2gn" event={"ID":"47714f4e-937b-411d-b7ba-1d3c613a90ab","Type":"ContainerDied","Data":"b593a70afcfb7195204d9cf15b2c19036a94d89ed2fda33bbd52596c4968726a"} Apr 24 21:31:57.231828 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.231803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" event={"ID":"98a3c479-04bc-4dfa-850a-9146a8ebcda5","Type":"ContainerStarted","Data":"22128619f5c010548842b2e58ac18b0fb55d992820db820f4781a22fb15ebf28"} Apr 24 21:31:57.231932 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.231835 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" event={"ID":"98a3c479-04bc-4dfa-850a-9146a8ebcda5","Type":"ContainerStarted","Data":"70316f571ee4da8fea4de820e1605412c071d4b6eb254c111e162e23a5328a36"} Apr 24 21:31:57.231932 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.231848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" event={"ID":"98a3c479-04bc-4dfa-850a-9146a8ebcda5","Type":"ContainerStarted","Data":"d40f007e8898a077a362f3dde69b152b2b6c3a654ac60aa1781fb60e6b3209a1"} Apr 24 21:31:57.289671 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.289650 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:31:57.585316 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:57.585294 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:31:57.588136 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:31:57.588110 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod637b37cc_d73f_40aa_a2ce_c867a036839e.slice/crio-76ced6527a754993a925ccf27357b421cdc6ab575e37f005e758a9188ac83970 WatchSource:0}: Error finding container 76ced6527a754993a925ccf27357b421cdc6ab575e37f005e758a9188ac83970: Status 404 returned error can't find the container with id 76ced6527a754993a925ccf27357b421cdc6ab575e37f005e758a9188ac83970 Apr 24 21:31:58.237466 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:58.237396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rm2gn" event={"ID":"47714f4e-937b-411d-b7ba-1d3c613a90ab","Type":"ContainerStarted","Data":"01a903d37c7cc19d11ed4b30d6ad2b08279beb5cf9ebffec4237dd02b2f3aa32"} Apr 24 21:31:58.237466 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:58.237438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rm2gn" event={"ID":"47714f4e-937b-411d-b7ba-1d3c613a90ab","Type":"ContainerStarted","Data":"95bd12443c4ac628bc65bdc979ded979fcf6830089454c0d1147947d72bc1432"} Apr 24 21:31:58.241336 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:58.241068 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" event={"ID":"98a3c479-04bc-4dfa-850a-9146a8ebcda5","Type":"ContainerStarted","Data":"768d5b8f87483a8fd6eb8524dfdd4001bca91a7cbad29bf05c84946aaf267cc1"} Apr 24 21:31:58.243750 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:58.243722 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerStarted","Data":"76ced6527a754993a925ccf27357b421cdc6ab575e37f005e758a9188ac83970"} Apr 24 21:31:58.272103 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:58.272053 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rm2gn" podStartSLOduration=2.545057358 podStartE2EDuration="3.272036407s" podCreationTimestamp="2026-04-24 21:31:55 +0000 UTC" firstStartedPulling="2026-04-24 21:31:55.618454004 +0000 UTC m=+189.489431482" lastFinishedPulling="2026-04-24 21:31:56.345433045 +0000 UTC m=+190.216410531" observedRunningTime="2026-04-24 21:31:58.2696776 +0000 UTC m=+192.140655100" watchObservedRunningTime="2026-04-24 21:31:58.272036407 +0000 UTC m=+192.143013909" Apr 24 21:31:58.296831 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:58.296791 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dd2lk" podStartSLOduration=2.320660959 podStartE2EDuration="3.296776356s" podCreationTimestamp="2026-04-24 21:31:55 +0000 UTC" firstStartedPulling="2026-04-24 21:31:56.534729627 +0000 UTC m=+190.405707106" lastFinishedPulling="2026-04-24 21:31:57.510845018 +0000 UTC m=+191.381822503" observedRunningTime="2026-04-24 21:31:58.296259801 +0000 UTC m=+192.167237302" watchObservedRunningTime="2026-04-24 21:31:58.296776356 +0000 UTC m=+192.167753856" Apr 24 21:31:59.249660 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:59.249626 2574 generic.go:358] "Generic (PLEG): container finished" podID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerID="c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5" exitCode=0 Apr 24 21:31:59.250039 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:31:59.249704 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5"} Apr 24 21:32:00.050523 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.050493 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl"] Apr 24 21:32:00.053561 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.053540 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" Apr 24 21:32:00.055821 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.055800 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 21:32:00.056024 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.056006 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-rvfz5\"" Apr 24 21:32:00.060668 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.060621 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl"] Apr 24 21:32:00.125006 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.124939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/25135849-7d9d-4889-aa15-bd6bcbd9cf27-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bkrnl\" (UID: \"25135849-7d9d-4889-aa15-bd6bcbd9cf27\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" Apr 24 21:32:00.226421 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.226373 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/25135849-7d9d-4889-aa15-bd6bcbd9cf27-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bkrnl\" (UID: 
\"25135849-7d9d-4889-aa15-bd6bcbd9cf27\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" Apr 24 21:32:00.228947 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.228920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/25135849-7d9d-4889-aa15-bd6bcbd9cf27-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bkrnl\" (UID: \"25135849-7d9d-4889-aa15-bd6bcbd9cf27\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" Apr 24 21:32:00.365144 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.365122 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" Apr 24 21:32:00.584075 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:00.584048 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl"] Apr 24 21:32:00.586800 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:32:00.586773 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25135849_7d9d_4889_aa15_bd6bcbd9cf27.slice/crio-b199e2ba166349e6e7e9721a6d2c0267db285724a4ac062006855652d23d2470 WatchSource:0}: Error finding container b199e2ba166349e6e7e9721a6d2c0267db285724a4ac062006855652d23d2470: Status 404 returned error can't find the container with id b199e2ba166349e6e7e9721a6d2c0267db285724a4ac062006855652d23d2470 Apr 24 21:32:01.257309 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.257264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" event={"ID":"25135849-7d9d-4889-aa15-bd6bcbd9cf27","Type":"ContainerStarted","Data":"b199e2ba166349e6e7e9721a6d2c0267db285724a4ac062006855652d23d2470"} Apr 24 21:32:01.260473 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.260451 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerStarted","Data":"89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028"} Apr 24 21:32:01.260603 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.260488 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerStarted","Data":"dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e"} Apr 24 21:32:01.260603 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.260501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerStarted","Data":"c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493"} Apr 24 21:32:01.260603 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.260509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerStarted","Data":"bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019"} Apr 24 21:32:01.260603 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.260517 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerStarted","Data":"d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28"} Apr 24 21:32:01.571236 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.571201 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:32:01.575562 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.575541 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.578074 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.578051 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:32:01.578074 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.578069 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:32:01.578399 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.578348 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:32:01.578399 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.578379 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:32:01.578540 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.578384 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:32:01.578540 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.578384 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:32:01.580056 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.579921 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8u3lfl3th7m67\"" Apr 24 21:32:01.581178 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.581006 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:32:01.581178 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.581043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:32:01.581330 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.581295 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:32:01.584474 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.581476 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:32:01.584474 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.581490 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:32:01.584474 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.581632 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-z9sfc\"" Apr 24 21:32:01.584474 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.581797 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:32:01.586386 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.586358 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:32:01.591892 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.591869 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:32:01.637370 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chzr\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-kube-api-access-2chzr\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637481 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637481 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-web-config\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637595 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637595 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637595 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637595 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637801 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637801 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637801 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637693 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637801 
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637801 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.637801 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637786 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.638082 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.638082 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637855 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.638082 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config-out\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.638082 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.638082 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.637946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739293 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739293 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:32:01.739298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2chzr\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-kube-api-access-2chzr\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739486 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739486 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-web-config\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739486 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739486 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739681 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:32:01.739503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739539 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739573 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739966 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739966 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739966 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739966 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739966 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739812 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739966 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config-out\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.739966 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.739865 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.740462 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.740345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.740988 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.740789 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:01.741080 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.741062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.744906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.742212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.744906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.743096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.744906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.743550 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.744906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.743846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.744906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.743973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.744906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.744289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.744906 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.744839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config-out\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.745327 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.744918 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.745327 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.744941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.745327 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.745051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.745327 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.745274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.745514 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.745415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.745662 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.745642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.746252 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.746205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-web-config\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.747937 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.747915 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chzr\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-kube-api-access-2chzr\") pod \"prometheus-k8s-0\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:01.892751 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:01.892717 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:02.136437 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.136411 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:32:02.138829 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:32:02.138804 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ed03c7e_bf02_42a0_90e4_c8c8027b31c2.slice/crio-f5669a9fff98191503b63763758806b84c23b40f56b96f33bee8c365fc8a3dba WatchSource:0}: Error finding container f5669a9fff98191503b63763758806b84c23b40f56b96f33bee8c365fc8a3dba: Status 404 returned error can't find the container with id f5669a9fff98191503b63763758806b84c23b40f56b96f33bee8c365fc8a3dba
Apr 24 21:32:02.264695 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.264667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" event={"ID":"25135849-7d9d-4889-aa15-bd6bcbd9cf27","Type":"ContainerStarted","Data":"23b36427d1b73591b58db2299bfcf8031b821d4b5f9c006aaad8ae05cefab9d6"}
Apr 24 21:32:02.264900 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.264875 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl"
Apr 24 21:32:02.268420 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.268375 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerStarted","Data":"b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c"}
Apr 24 21:32:02.269894 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.269871 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerID="ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b" exitCode=0
Apr 24 21:32:02.269981 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.269905 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b"}
Apr 24 21:32:02.269981 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.269925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerStarted","Data":"f5669a9fff98191503b63763758806b84c23b40f56b96f33bee8c365fc8a3dba"}
Apr 24 21:32:02.271007 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.270991 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl"
Apr 24 21:32:02.286320 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.286273 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bkrnl" podStartSLOduration=0.820269722 podStartE2EDuration="2.286261219s" podCreationTimestamp="2026-04-24 21:32:00 +0000 UTC" firstStartedPulling="2026-04-24 21:32:00.588397966 +0000 UTC m=+194.459375443" lastFinishedPulling="2026-04-24 21:32:02.05438946 +0000 UTC m=+195.925366940" observedRunningTime="2026-04-24 21:32:02.286097191 +0000 UTC m=+196.157074687" watchObservedRunningTime="2026-04-24 21:32:02.286261219 +0000 UTC m=+196.157238718"
Apr 24 21:32:02.359675 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:02.359638 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.408713901 podStartE2EDuration="6.359625787s" podCreationTimestamp="2026-04-24 21:31:56 +0000 UTC" firstStartedPulling="2026-04-24 21:31:57.589910125 +0000 UTC m=+191.460887602" lastFinishedPulling="2026-04-24 21:32:01.540822009 +0000 UTC m=+195.411799488" observedRunningTime="2026-04-24 21:32:02.358173932 +0000 UTC m=+196.229151431" watchObservedRunningTime="2026-04-24 21:32:02.359625787 +0000 UTC m=+196.230603282"
Apr 24 21:32:05.282108 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:05.281435 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerStarted","Data":"3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605"}
Apr 24 21:32:05.282108 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:05.281481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerStarted","Data":"9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223"}
Apr 24 21:32:07.294167 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:07.294128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerStarted","Data":"1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8"}
Apr 24 21:32:07.294534 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:07.294174 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerStarted","Data":"54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4"}
Apr 24 21:32:07.294534 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:07.294187 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerStarted","Data":"985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc"}
Apr 24 21:32:07.294534 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:07.294200 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerStarted","Data":"afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53"}
Apr 24 21:32:07.327646 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:07.327603 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.054741651 podStartE2EDuration="6.327591513s" podCreationTimestamp="2026-04-24 21:32:01 +0000 UTC" firstStartedPulling="2026-04-24 21:32:02.271236978 +0000 UTC m=+196.142214467" lastFinishedPulling="2026-04-24 21:32:06.544086834 +0000 UTC m=+200.415064329" observedRunningTime="2026-04-24 21:32:07.326323646 +0000 UTC m=+201.197301144" watchObservedRunningTime="2026-04-24 21:32:07.327591513 +0000 UTC m=+201.198569012"
Apr 24 21:32:11.550219 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.550154 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" podUID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d" containerName="registry" containerID="cri-o://d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9" gracePeriod=30
Apr 24 21:32:11.808794 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.808742 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:32:11.893453 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.893432 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:32:11.927681 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927657 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.927771 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927693 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-certificates\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.927771 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927720 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-ca-trust-extracted\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.927771 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927738 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-bound-sa-token\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.927771 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927768 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-image-registry-private-configuration\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.927950 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927799 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-trusted-ca\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.927950 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927820 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-installation-pull-secrets\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.927950 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.927835 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4llr\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-kube-api-access-n4llr\") pod \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\" (UID: \"263f8ce6-7a53-4ba0-808d-ac71652fdc4d\") "
Apr 24 21:32:11.928095 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.928077 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:11.928437 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.928399 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:32:11.930263 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.930192 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:32:11.930263 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.930248 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:11.930404 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.930342 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:32:11.930499 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.930476 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-kube-api-access-n4llr" (OuterVolumeSpecName: "kube-api-access-n4llr") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "kube-api-access-n4llr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:11.930566 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.930520 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:11.936101 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:11.936075 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "263f8ce6-7a53-4ba0-808d-ac71652fdc4d" (UID: "263f8ce6-7a53-4ba0-808d-ac71652fdc4d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:12.029377 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029351 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-ca-trust-extracted\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.029489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029380 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-bound-sa-token\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.029489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029396 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-image-registry-private-configuration\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.029489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029411 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-trusted-ca\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.029489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029425 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-installation-pull-secrets\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.029489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029438 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4llr\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-kube-api-access-n4llr\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.029489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029451 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-tls\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.029489 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.029466 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/263f8ce6-7a53-4ba0-808d-ac71652fdc4d-registry-certificates\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\""
Apr 24 21:32:12.309142 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.309109 2574 generic.go:358] "Generic (PLEG): container finished" podID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d" containerID="d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9" exitCode=0
Apr 24 21:32:12.309287 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.309156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" event={"ID":"263f8ce6-7a53-4ba0-808d-ac71652fdc4d","Type":"ContainerDied","Data":"d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9"}
Apr 24 21:32:12.309287 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.309162 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65859fb559-pxvcf"
Apr 24 21:32:12.309287 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.309177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65859fb559-pxvcf" event={"ID":"263f8ce6-7a53-4ba0-808d-ac71652fdc4d","Type":"ContainerDied","Data":"fb5013c9b2aa9f942fae378054f9c2ebda38c2975e76ace116192c0e870ead54"}
Apr 24 21:32:12.309287 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.309201 2574 scope.go:117] "RemoveContainer" containerID="d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9"
Apr 24 21:32:12.318751 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.318732 2574 scope.go:117] "RemoveContainer" containerID="d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9"
Apr 24 21:32:12.319011 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:32:12.318987 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9\": container with ID starting with d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9 not found: ID does not exist" containerID="d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9"
Apr 24 21:32:12.319063 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.319020 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9"} err="failed to get container status \"d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9\": rpc error: code = NotFound desc = could not find container \"d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9\": container with ID starting with d68ad8470c23f20e60d2187581655c04a391538c9aa7afb534a7642e7b9653e9 not found: ID does not exist"
Apr 24 21:32:12.329840 ip-10-0-142-162 kubenswrapper[2574]: I0424
21:32:12.329822 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65859fb559-pxvcf"]
Apr 24 21:32:12.336200 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.336182 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-65859fb559-pxvcf"]
Apr 24 21:32:12.715633 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:12.715600 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d" path="/var/lib/kubelet/pods/263f8ce6-7a53-4ba0-808d-ac71652fdc4d/volumes"
Apr 24 21:32:56.723843 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:56.723808 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" podUID="9cd8aae5-3752-411b-9b22-2c9688bb5914" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:32:58.464139 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:58.464097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:32:58.466570 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:58.466548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f53a-a544-4d53-ba11-2dd3b3259ed0-metrics-certs\") pod \"network-metrics-daemon-tdnnb\" (UID: \"fba6f53a-a544-4d53-ba11-2dd3b3259ed0\") " pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:32:58.616952 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:58.616923 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-flqk8\""
Apr 24 21:32:58.624936 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:58.624915 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tdnnb"
Apr 24 21:32:58.740014 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:58.739986 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tdnnb"]
Apr 24 21:32:58.743413 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:32:58.743378 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba6f53a_a544_4d53_ba11_2dd3b3259ed0.slice/crio-18f30868a8fbfb0831796cc98fa8fc1d156a69cbec95ba082c23f7f8f7af17aa WatchSource:0}: Error finding container 18f30868a8fbfb0831796cc98fa8fc1d156a69cbec95ba082c23f7f8f7af17aa: Status 404 returned error can't find the container with id 18f30868a8fbfb0831796cc98fa8fc1d156a69cbec95ba082c23f7f8f7af17aa
Apr 24 21:32:59.440622 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:32:59.440581 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tdnnb" event={"ID":"fba6f53a-a544-4d53-ba11-2dd3b3259ed0","Type":"ContainerStarted","Data":"18f30868a8fbfb0831796cc98fa8fc1d156a69cbec95ba082c23f7f8f7af17aa"}
Apr 24 21:33:00.446543 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:00.446506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tdnnb" event={"ID":"fba6f53a-a544-4d53-ba11-2dd3b3259ed0","Type":"ContainerStarted","Data":"8790482524d93b4c51b50fe2105583fefbd957f32693d23f284405627e60d0b6"}
Apr 24 21:33:00.446543 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:00.446547 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tdnnb" event={"ID":"fba6f53a-a544-4d53-ba11-2dd3b3259ed0","Type":"ContainerStarted","Data":"0cae967b1714866c87a015b00e20dd425d127c03b0d4d045aa72dcacb5c8ead2"}
Apr 24 21:33:00.465026 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:00.464802 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tdnnb" podStartSLOduration=253.447862933 podStartE2EDuration="4m14.46478532s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:32:58.745313085 +0000 UTC m=+252.616290562" lastFinishedPulling="2026-04-24 21:32:59.762235458 +0000 UTC m=+253.633212949" observedRunningTime="2026-04-24 21:33:00.464462859 +0000 UTC m=+254.335440359" watchObservedRunningTime="2026-04-24 21:33:00.46478532 +0000 UTC m=+254.335762820"
Apr 24 21:33:01.893153 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:01.893120 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:33:01.915874 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:01.915849 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:33:02.467009 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:02.466985 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:33:06.724295 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:06.724255 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" podUID="9cd8aae5-3752-411b-9b22-2c9688bb5914" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:33:15.662163 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:15.662124 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:33:15.662672 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:15.662615 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="alertmanager" containerID="cri-o://d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28" gracePeriod=120
Apr 24 21:33:15.662873 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:15.662691 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-metric" containerID="cri-o://89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028" gracePeriod=120
Apr 24 21:33:15.662873 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:15.662724 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-web" containerID="cri-o://c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493" gracePeriod=120
Apr 24 21:33:15.662873 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:15.662759 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy" containerID="cri-o://dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e" gracePeriod=120
Apr 24 21:33:15.662873 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:15.662789 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="prom-label-proxy" containerID="cri-o://b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c" gracePeriod=120
Apr 24 21:33:15.662873 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:15.662756 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="config-reloader" containerID="cri-o://bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019" gracePeriod=120
Apr 24 21:33:16.501488 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501450 2574 generic.go:358] "Generic (PLEG): container finished" podID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerID="b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c" exitCode=0
Apr 24 21:33:16.501488 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501481 2574 generic.go:358] "Generic (PLEG): container finished" podID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerID="89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028" exitCode=0
Apr 24 21:33:16.501488 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501489 2574 generic.go:358] "Generic (PLEG): container finished" podID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerID="dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e" exitCode=0
Apr 24 21:33:16.501769 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501499 2574 generic.go:358] "Generic (PLEG): container finished" podID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerID="bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019" exitCode=0
Apr 24 21:33:16.501769 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501508 2574 generic.go:358] "Generic (PLEG): container finished" podID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerID="d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28" exitCode=0
Apr 24 21:33:16.501769 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501521 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c"}
Apr 24 21:33:16.501769 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028"}
Apr 24 21:33:16.501769 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e"}
Apr 24 21:33:16.501769 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019"}
Apr 24 21:33:16.501769 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.501604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28"}
Apr 24 21:33:16.723286 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.723256 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" podUID="9cd8aae5-3752-411b-9b22-2c9688bb5914" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 21:33:16.723707 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.723311 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l"
Apr 24 21:33:16.723707 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.723689 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"96b95c57021f3706fac1e3852e1fb3e3729e72a43ed41762367be8e26fd63399"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 21:33:16.723793 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.723723 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" podUID="9cd8aae5-3752-411b-9b22-2c9688bb5914" containerName="service-proxy" containerID="cri-o://96b95c57021f3706fac1e3852e1fb3e3729e72a43ed41762367be8e26fd63399" gracePeriod=30
Apr 24 21:33:16.955455 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.955432 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992418 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-config-out\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") "
Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992493 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-trusted-ca-bundle\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") "
Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992567 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") "
Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992620 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-tls-assets\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") "
Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992655 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-main-db\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID:
\"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992683 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992710 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxgss\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-kube-api-access-xxgss\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992751 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-cluster-tls-config\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992780 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-web\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992807 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-config-volume\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 
24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992842 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-metrics-client-ca\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992865 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-web-config\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.992944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.992893 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"637b37cc-d73f-40aa-a2ce-c867a036839e\" (UID: \"637b37cc-d73f-40aa-a2ce-c867a036839e\") " Apr 24 21:33:16.994162 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.994128 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:16.995041 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.994933 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:16.995041 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.995006 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:16.995041 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.995014 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:16.995758 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.995729 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-config-out" (OuterVolumeSpecName: "config-out") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:16.997137 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.997038 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-config-volume" (OuterVolumeSpecName: "config-volume") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:16.997137 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.997108 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:16.997455 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.997421 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:16.997656 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.997630 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:16.998501 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.998469 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:16.998819 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:16.998798 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-kube-api-access-xxgss" (OuterVolumeSpecName: "kube-api-access-xxgss") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "kube-api-access-xxgss". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:17.000538 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.000512 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:17.007248 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.007199 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-web-config" (OuterVolumeSpecName: "web-config") pod "637b37cc-d73f-40aa-a2ce-c867a036839e" (UID: "637b37cc-d73f-40aa-a2ce-c867a036839e"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:17.094418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094369 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-tls-assets\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094389 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-main-db\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094399 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-main-tls\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094408 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxgss\" (UniqueName: \"kubernetes.io/projected/637b37cc-d73f-40aa-a2ce-c867a036839e-kube-api-access-xxgss\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094418 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094416 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-cluster-tls-config\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094426 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-web\") on node 
\"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094435 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-config-volume\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094443 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-metrics-client-ca\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094452 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-web-config\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094460 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094469 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/637b37cc-d73f-40aa-a2ce-c867a036839e-config-out\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094477 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/637b37cc-d73f-40aa-a2ce-c867a036839e-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 
21:33:17.094616 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.094487 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/637b37cc-d73f-40aa-a2ce-c867a036839e-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:17.506274 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.506238 2574 generic.go:358] "Generic (PLEG): container finished" podID="9cd8aae5-3752-411b-9b22-2c9688bb5914" containerID="96b95c57021f3706fac1e3852e1fb3e3729e72a43ed41762367be8e26fd63399" exitCode=2 Apr 24 21:33:17.506274 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.506257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" event={"ID":"9cd8aae5-3752-411b-9b22-2c9688bb5914","Type":"ContainerDied","Data":"96b95c57021f3706fac1e3852e1fb3e3729e72a43ed41762367be8e26fd63399"} Apr 24 21:33:17.506484 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.506293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9b5b999d-9nl7l" event={"ID":"9cd8aae5-3752-411b-9b22-2c9688bb5914","Type":"ContainerStarted","Data":"1ef6817d108ff8001c282491e203d04377dfce27f729d26bf101845c4f35120c"} Apr 24 21:33:17.508815 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.508786 2574 generic.go:358] "Generic (PLEG): container finished" podID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerID="c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493" exitCode=0 Apr 24 21:33:17.508919 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.508867 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493"} Apr 24 21:33:17.508919 
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.508898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"637b37cc-d73f-40aa-a2ce-c867a036839e","Type":"ContainerDied","Data":"76ced6527a754993a925ccf27357b421cdc6ab575e37f005e758a9188ac83970"} Apr 24 21:33:17.508919 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.508909 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:33:17.509014 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.508915 2574 scope.go:117] "RemoveContainer" containerID="b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c" Apr 24 21:33:17.515824 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.515808 2574 scope.go:117] "RemoveContainer" containerID="89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028" Apr 24 21:33:17.521949 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.521935 2574 scope.go:117] "RemoveContainer" containerID="dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e" Apr 24 21:33:17.528014 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.527996 2574 scope.go:117] "RemoveContainer" containerID="c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493" Apr 24 21:33:17.534037 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.534021 2574 scope.go:117] "RemoveContainer" containerID="bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019" Apr 24 21:33:17.540312 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.540297 2574 scope.go:117] "RemoveContainer" containerID="d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28" Apr 24 21:33:17.546198 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.546179 2574 scope.go:117] "RemoveContainer" containerID="c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5" Apr 24 21:33:17.552019 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.552006 2574 
scope.go:117] "RemoveContainer" containerID="b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c" Apr 24 21:33:17.552276 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:17.552255 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c\": container with ID starting with b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c not found: ID does not exist" containerID="b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c" Apr 24 21:33:17.552326 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.552284 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c"} err="failed to get container status \"b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c\": rpc error: code = NotFound desc = could not find container \"b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c\": container with ID starting with b7b4f684313d73cdbba7ebafb993d64a04331e0bbab58fa39291a4156b3d1a4c not found: ID does not exist" Apr 24 21:33:17.552326 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.552299 2574 scope.go:117] "RemoveContainer" containerID="89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028" Apr 24 21:33:17.552507 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:17.552490 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028\": container with ID starting with 89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028 not found: ID does not exist" containerID="89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028" Apr 24 21:33:17.552567 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.552516 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028"} err="failed to get container status \"89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028\": rpc error: code = NotFound desc = could not find container \"89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028\": container with ID starting with 89c529c7b38b2597ff355acca89f502079a2cdd7d092129bc0f0a44e5175a028 not found: ID does not exist" Apr 24 21:33:17.552567 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.552538 2574 scope.go:117] "RemoveContainer" containerID="dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e" Apr 24 21:33:17.552750 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:17.552732 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e\": container with ID starting with dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e not found: ID does not exist" containerID="dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e" Apr 24 21:33:17.552785 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.552756 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e"} err="failed to get container status \"dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e\": rpc error: code = NotFound desc = could not find container \"dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e\": container with ID starting with dec5e88614a90574a12094b8b8e8cfdaad6d8d5ebf004cbeef6bf81ffa3e5f8e not found: ID does not exist" Apr 24 21:33:17.552785 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.552771 2574 scope.go:117] "RemoveContainer" 
containerID="c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493" Apr 24 21:33:17.553013 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:17.552997 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493\": container with ID starting with c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493 not found: ID does not exist" containerID="c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493" Apr 24 21:33:17.553063 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.553016 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493"} err="failed to get container status \"c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493\": rpc error: code = NotFound desc = could not find container \"c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493\": container with ID starting with c4fdbdc3751b5ca6a708004526fc103036af3f3ed3b81da3db81a3ce4f161493 not found: ID does not exist" Apr 24 21:33:17.553063 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.553030 2574 scope.go:117] "RemoveContainer" containerID="bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019" Apr 24 21:33:17.553253 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:17.553218 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019\": container with ID starting with bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019 not found: ID does not exist" containerID="bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019" Apr 24 21:33:17.553300 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.553258 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019"} err="failed to get container status \"bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019\": rpc error: code = NotFound desc = could not find container \"bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019\": container with ID starting with bbd4f41c55a91e4156336dc30b2a0c200dd3d6c69d2d2d89d45c506aadde6019 not found: ID does not exist" Apr 24 21:33:17.553300 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.553274 2574 scope.go:117] "RemoveContainer" containerID="d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28" Apr 24 21:33:17.553515 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:17.553496 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28\": container with ID starting with d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28 not found: ID does not exist" containerID="d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28" Apr 24 21:33:17.553562 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.553519 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28"} err="failed to get container status \"d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28\": rpc error: code = NotFound desc = could not find container \"d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28\": container with ID starting with d07221168b05a275efd1d3997883eead4db5898a21e83566a54c8d4f2f9cde28 not found: ID does not exist" Apr 24 21:33:17.553562 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.553532 2574 scope.go:117] "RemoveContainer" containerID="c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5" Apr 24 21:33:17.553735 
ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:17.553717 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5\": container with ID starting with c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5 not found: ID does not exist" containerID="c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5"
Apr 24 21:33:17.553814 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.553737 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5"} err="failed to get container status \"c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5\": rpc error: code = NotFound desc = could not find container \"c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5\": container with ID starting with c2539c3186108c5e40af4c1e10bda06907058d704cb0ef3b03349a3f62c2f8a5 not found: ID does not exist"
Apr 24 21:33:17.559625 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.559604 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:33:17.575508 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.575483 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:33:17.639178 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639157 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:33:17.639461 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639436 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="config-reloader"
Apr 24 21:33:17.639461 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639450 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="config-reloader"
Apr 24 21:33:17.639461 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639459 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="init-config-reloader"
Apr 24 21:33:17.639461 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639465 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="init-config-reloader"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639472 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="alertmanager"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639478 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="alertmanager"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639488 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-web"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639493 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-web"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639500 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639504 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639510 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-metric"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639515 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-metric"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639522 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d" containerName="registry"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639526 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d" containerName="registry"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639539 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="prom-label-proxy"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639543 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="prom-label-proxy"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639583 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="263f8ce6-7a53-4ba0-808d-ac71652fdc4d" containerName="registry"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639591 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-metric"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639597 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="alertmanager"
Apr 24 21:33:17.639599 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639603 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="config-reloader"
Apr 24 21:33:17.640030 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639609 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy-web"
Apr 24 21:33:17.640030 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639615 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="kube-rbac-proxy"
Apr 24 21:33:17.640030 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.639621 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" containerName="prom-label-proxy"
Apr 24 21:33:17.644488 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.644474 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.646877 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.646857 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 21:33:17.646968 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.646878 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 21:33:17.647154 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.647132 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 21:33:17.647221 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.647144 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 21:33:17.647356 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.647212 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 21:33:17.647356 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.647163 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 21:33:17.647356 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.647296 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7pqz5\""
Apr 24 21:33:17.647548 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.647533 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 21:33:17.647622 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.647609 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 21:33:17.652928 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.652907 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 21:33:17.657275 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.657254 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:33:17.697713 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23342931-f34f-4e4b-92c6-85da5c12481d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.697800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697717 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23342931-f34f-4e4b-92c6-85da5c12481d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.697800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697735 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23342931-f34f-4e4b-92c6-85da5c12481d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.697800 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.697939 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697831 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23342931-f34f-4e4b-92c6-85da5c12481d-config-out\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.697939 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-config-volume\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.698016 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23342931-f34f-4e4b-92c6-85da5c12481d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.698016 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.697993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.698094 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.698025 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.698094 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.698044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.698094 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.698060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.698094 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.698078 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbg6\" (UniqueName: \"kubernetes.io/projected/23342931-f34f-4e4b-92c6-85da5c12481d-kube-api-access-vgbg6\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.698094 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.698093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-web-config\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799330 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799330 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799335 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgbg6\" (UniqueName: \"kubernetes.io/projected/23342931-f34f-4e4b-92c6-85da5c12481d-kube-api-access-vgbg6\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-web-config\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799449 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23342931-f34f-4e4b-92c6-85da5c12481d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23342931-f34f-4e4b-92c6-85da5c12481d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23342931-f34f-4e4b-92c6-85da5c12481d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23342931-f34f-4e4b-92c6-85da5c12481d-config-out\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799607 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-config-volume\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.799722 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.799662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23342931-f34f-4e4b-92c6-85da5c12481d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.800265 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.800065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/23342931-f34f-4e4b-92c6-85da5c12481d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.800814 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.800787 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23342931-f34f-4e4b-92c6-85da5c12481d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.800931 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.800908 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23342931-f34f-4e4b-92c6-85da5c12481d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.802335 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.802312 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.802492 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.802469 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.802572 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.802508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.802971 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.802949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.803080 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.803007 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.803080 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.803038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-web-config\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.803621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.803602 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23342931-f34f-4e4b-92c6-85da5c12481d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.803821 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.803807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/23342931-f34f-4e4b-92c6-85da5c12481d-config-volume\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.804065 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.804052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23342931-f34f-4e4b-92c6-85da5c12481d-config-out\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.808313 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.808295 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbg6\" (UniqueName: \"kubernetes.io/projected/23342931-f34f-4e4b-92c6-85da5c12481d-kube-api-access-vgbg6\") pod \"alertmanager-main-0\" (UID: \"23342931-f34f-4e4b-92c6-85da5c12481d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:17.953428 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:17.953405 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:33:18.080921 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:18.080874 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:33:18.084444 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:33:18.084402 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23342931_f34f_4e4b_92c6_85da5c12481d.slice/crio-da3d64345b54bb62d7f3efa192c0146c0978df23145d0cd85453f0bfcb2ab45a WatchSource:0}: Error finding container da3d64345b54bb62d7f3efa192c0146c0978df23145d0cd85453f0bfcb2ab45a: Status 404 returned error can't find the container with id da3d64345b54bb62d7f3efa192c0146c0978df23145d0cd85453f0bfcb2ab45a
Apr 24 21:33:18.513794 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:18.513749 2574 generic.go:358] "Generic (PLEG): container finished" podID="23342931-f34f-4e4b-92c6-85da5c12481d" containerID="3d1c49d96cc9e19a15ca8d689f39d6d9569c3e3a27780e44cee6ce35c75f60b1" exitCode=0
Apr 24 21:33:18.513960 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:18.513851 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerDied","Data":"3d1c49d96cc9e19a15ca8d689f39d6d9569c3e3a27780e44cee6ce35c75f60b1"}
Apr 24 21:33:18.513960 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:18.513892 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerStarted","Data":"da3d64345b54bb62d7f3efa192c0146c0978df23145d0cd85453f0bfcb2ab45a"}
Apr 24 21:33:18.717251 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:18.716910 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637b37cc-d73f-40aa-a2ce-c867a036839e" path="/var/lib/kubelet/pods/637b37cc-d73f-40aa-a2ce-c867a036839e/volumes"
Apr 24 21:33:19.521683 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.521646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerStarted","Data":"8584401ba4edb801ee3d7cbd6077999795932bec1cdc74a58aeef3aeb58b4bba"}
Apr 24 21:33:19.521683 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.521685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerStarted","Data":"6521dec324deb9a80e713c43a43ca01f728795800c7c05a9c23f6015febdb503"}
Apr 24 21:33:19.522084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.521697 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerStarted","Data":"d33096e7c1cf75103124cccdcd03a4dee0b70c264d9fcea8950340ca47a292c8"}
Apr 24 21:33:19.522084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.521708 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerStarted","Data":"2f6aeac46e1a2b352b81e9681ea7900457fdac8f5878f1e64f38654bedcd4605"}
Apr 24 21:33:19.522084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.521716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerStarted","Data":"18a0f99862e0e4a973cb5e6a05ea50274c9daf16fcc0d91edd5ce381f43c9cfb"}
Apr 24 21:33:19.522084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.521724 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"23342931-f34f-4e4b-92c6-85da5c12481d","Type":"ContainerStarted","Data":"2fecae2069a48b8c8806bbed39a943c6f51a6beebc63526bc3414d5ef26e9d76"}
Apr 24 21:33:19.549988 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.549941 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.549928783 podStartE2EDuration="2.549928783s" podCreationTimestamp="2026-04-24 21:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:19.547679124 +0000 UTC m=+273.418656624" watchObservedRunningTime="2026-04-24 21:33:19.549928783 +0000 UTC m=+273.420906283"
Apr 24 21:33:19.705672 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.705640 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"]
Apr 24 21:33:19.708841 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.708822 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.711140 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.711119 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-s9rvg\""
Apr 24 21:33:19.711140 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.711131 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 21:33:19.711370 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.711357 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 21:33:19.711595 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.711578 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 21:33:19.711667 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.711622 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 21:33:19.711667 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.711634 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 21:33:19.716340 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.716319 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 21:33:19.727988 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.727969 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"]
Apr 24 21:33:19.815577 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815527 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-federate-client-tls\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.815577 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815559 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw45j\" (UniqueName: \"kubernetes.io/projected/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-kube-api-access-kw45j\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.815708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-metrics-client-ca\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.815708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-secret-telemeter-client\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.815708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815665 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-telemeter-client-tls\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.815708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.815867 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.815867 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.815737 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-serving-certs-ca-bundle\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916497 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-secret-telemeter-client\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916589 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-telemeter-client-tls\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916589 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916589 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916589 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-serving-certs-ca-bundle\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916793 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-federate-client-tls\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916793 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kw45j\" (UniqueName: \"kubernetes.io/projected/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-kube-api-access-kw45j\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.916793 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.916660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-metrics-client-ca\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.917395 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.917374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-serving-certs-ca-bundle\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"
Apr 24 21:33:19.917499 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.917478 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-telemeter-trusted-ca-bundle\") pod
\"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:19.917548 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.917525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-metrics-client-ca\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:19.919120 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.919099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:19.919212 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.919197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-federate-client-tls\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:19.919773 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.919756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-telemeter-client-tls\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:19.919850 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.919811 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-secret-telemeter-client\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:19.925525 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:19.925504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw45j\" (UniqueName: \"kubernetes.io/projected/0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180-kube-api-access-kw45j\") pod \"telemeter-client-6569cfb4f9-h2stm\" (UID: \"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180\") " pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:20.017892 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.017859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" Apr 24 21:33:20.108210 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.108180 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:33:20.108901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.108651 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="prometheus" containerID="cri-o://9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223" gracePeriod=600 Apr 24 21:33:20.108901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.108689 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy" containerID="cri-o://54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4" gracePeriod=600 Apr 24 21:33:20.108901 ip-10-0-142-162 
kubenswrapper[2574]: I0424 21:33:20.108740 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-thanos" containerID="cri-o://1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8" gracePeriod=600 Apr 24 21:33:20.108901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.108815 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="config-reloader" containerID="cri-o://3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605" gracePeriod=600 Apr 24 21:33:20.108901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.108824 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-web" containerID="cri-o://985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc" gracePeriod=600 Apr 24 21:33:20.108901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.108828 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="thanos-sidecar" containerID="cri-o://afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53" gracePeriod=600 Apr 24 21:33:20.146905 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.146882 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6569cfb4f9-h2stm"] Apr 24 21:33:20.149214 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:33:20.149187 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0efd4b5d_ceb1_4ef4_92cc_fcdbecb49180.slice/crio-b12845462ddaccad41c71f07ee33857e4140e9e2a409c7c62340aa7a22943c01 WatchSource:0}: Error finding container b12845462ddaccad41c71f07ee33857e4140e9e2a409c7c62340aa7a22943c01: Status 404 returned error can't find the container with id b12845462ddaccad41c71f07ee33857e4140e9e2a409c7c62340aa7a22943c01 Apr 24 21:33:20.527520 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527491 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerID="1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8" exitCode=0 Apr 24 21:33:20.527520 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527517 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerID="54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4" exitCode=0 Apr 24 21:33:20.527520 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527525 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerID="afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53" exitCode=0 Apr 24 21:33:20.527903 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527531 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerID="3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605" exitCode=0 Apr 24 21:33:20.527903 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527540 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerID="9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223" exitCode=0 Apr 24 21:33:20.527903 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8"} Apr 24 21:33:20.527903 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4"} Apr 24 21:33:20.527903 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53"} Apr 24 21:33:20.527903 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527624 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605"} Apr 24 21:33:20.527903 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.527637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223"} Apr 24 21:33:20.528657 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:20.528629 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" event={"ID":"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180","Type":"ContainerStarted","Data":"b12845462ddaccad41c71f07ee33857e4140e9e2a409c7c62340aa7a22943c01"} Apr 24 21:33:21.363625 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.363603 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.428289 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428258 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-serving-certs-ca-bundle\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428407 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428298 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-web-config\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428407 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428328 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config-out\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428407 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428353 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-trusted-ca-bundle\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428407 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428382 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-rulefiles-0\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: 
\"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428410 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-tls\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428440 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-tls-assets\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428463 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-db\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428489 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-metrics-client-certs\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428545 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-metrics-client-ca\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428587 
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428570 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428614 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2chzr\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-kube-api-access-2chzr\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428639 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-kube-rbac-proxy\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428676 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-thanos-prometheus-http-client-file\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428674 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:21.428871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428708 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-grpc-tls\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428817 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.428871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428854 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-kubelet-serving-ca-bundle\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.429176 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.428886 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\" (UID: \"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2\") " Apr 24 21:33:21.429176 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.429055 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod 
"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:21.429176 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.429168 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.429325 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.429190 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.429796 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.429702 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:21.429913 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.429887 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:21.429989 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.429968 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:21.430626 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.430585 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:21.433710 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.433672 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:21.434775 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.434710 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.435783 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.435755 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config-out" (OuterVolumeSpecName: "config-out") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:21.435881 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.435850 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.436048 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.436024 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-kube-api-access-2chzr" (OuterVolumeSpecName: "kube-api-access-2chzr") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "kube-api-access-2chzr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:21.436208 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.436179 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.436498 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.436463 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.436589 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.436533 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.436656 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.436623 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.436875 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.436846 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.437079 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.437056 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config" (OuterVolumeSpecName: "config") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.446661 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.446640 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-web-config" (OuterVolumeSpecName: "web-config") pod "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" (UID: "1ed03c7e-bf02-42a0-90e4-c8c8027b31c2"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:21.529897 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529839 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.529897 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529868 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.529897 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529885 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-tls-assets\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529899 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-prometheus-k8s-db\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529914 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-metrics-client-certs\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529928 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-metrics-client-ca\") on node \"ip-10-0-142-162.ec2.internal\" 
DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529945 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529960 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2chzr\" (UniqueName: \"kubernetes.io/projected/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-kube-api-access-2chzr\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529973 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-kube-rbac-proxy\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.529988 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.530002 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-grpc-tls\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.530015 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 
21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.530031 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.530045 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.530057 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-web-config\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.530342 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.530065 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2-config-out\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:33:21.537515 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.537492 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerID="985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc" exitCode=0 Apr 24 21:33:21.537613 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.537572 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc"} Apr 24 21:33:21.537613 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.537610 
2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ed03c7e-bf02-42a0-90e4-c8c8027b31c2","Type":"ContainerDied","Data":"f5669a9fff98191503b63763758806b84c23b40f56b96f33bee8c365fc8a3dba"} Apr 24 21:33:21.537726 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.537621 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.537775 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.537626 2574 scope.go:117] "RemoveContainer" containerID="1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8" Apr 24 21:33:21.544742 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.544723 2574 scope.go:117] "RemoveContainer" containerID="54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4" Apr 24 21:33:21.551521 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.551500 2574 scope.go:117] "RemoveContainer" containerID="985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc" Apr 24 21:33:21.557862 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.557847 2574 scope.go:117] "RemoveContainer" containerID="afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53" Apr 24 21:33:21.564220 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.564203 2574 scope.go:117] "RemoveContainer" containerID="3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605" Apr 24 21:33:21.569770 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.569747 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:33:21.571274 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.571257 2574 scope.go:117] "RemoveContainer" containerID="9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223" Apr 24 21:33:21.578180 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.578159 2574 scope.go:117] "RemoveContainer" 
containerID="ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b" Apr 24 21:33:21.585840 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.585811 2574 scope.go:117] "RemoveContainer" containerID="1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8" Apr 24 21:33:21.586122 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:21.586098 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8\": container with ID starting with 1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8 not found: ID does not exist" containerID="1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8" Apr 24 21:33:21.586204 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.586132 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8"} err="failed to get container status \"1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8\": rpc error: code = NotFound desc = could not find container \"1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8\": container with ID starting with 1758c557f8edc74499e1e5f6886d0f75ef0319b5e6091ebf4e5cae3bd78e1fa8 not found: ID does not exist" Apr 24 21:33:21.586204 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.586157 2574 scope.go:117] "RemoveContainer" containerID="54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4" Apr 24 21:33:21.586456 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:21.586426 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4\": container with ID starting with 54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4 not found: ID does not exist" 
containerID="54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4" Apr 24 21:33:21.586551 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.586454 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4"} err="failed to get container status \"54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4\": rpc error: code = NotFound desc = could not find container \"54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4\": container with ID starting with 54c5c6aeb417f8dcbef73dc63bbfa63a2ae49228581a66f0fa919bc224366ac4 not found: ID does not exist" Apr 24 21:33:21.586551 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.586476 2574 scope.go:117] "RemoveContainer" containerID="985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc" Apr 24 21:33:21.586814 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:21.586793 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc\": container with ID starting with 985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc not found: ID does not exist" containerID="985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc" Apr 24 21:33:21.586886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.586823 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc"} err="failed to get container status \"985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc\": rpc error: code = NotFound desc = could not find container \"985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc\": container with ID starting with 985687bb9d968e217885debb4265656f39a6910078302ccbc2bd4ef4d25c19dc not found: ID does not exist" Apr 24 
21:33:21.586886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.586844 2574 scope.go:117] "RemoveContainer" containerID="afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53" Apr 24 21:33:21.587013 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.586989 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:33:21.587110 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:21.587089 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53\": container with ID starting with afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53 not found: ID does not exist" containerID="afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53" Apr 24 21:33:21.587157 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.587117 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53"} err="failed to get container status \"afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53\": rpc error: code = NotFound desc = could not find container \"afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53\": container with ID starting with afdca7651dc0c71cfe93d11993a17bae2874d58e4e9b0b4cfc949a2c3caeab53 not found: ID does not exist" Apr 24 21:33:21.587157 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.587139 2574 scope.go:117] "RemoveContainer" containerID="3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605" Apr 24 21:33:21.587437 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:21.587412 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605\": container with ID starting with 
3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605 not found: ID does not exist" containerID="3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605" Apr 24 21:33:21.587528 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.587442 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605"} err="failed to get container status \"3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605\": rpc error: code = NotFound desc = could not find container \"3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605\": container with ID starting with 3b74c67778b208e9433acdfeba27bbce8cb7ce5fec3dfab52d6e59b7f71f1605 not found: ID does not exist" Apr 24 21:33:21.587528 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.587463 2574 scope.go:117] "RemoveContainer" containerID="9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223" Apr 24 21:33:21.587756 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:21.587738 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223\": container with ID starting with 9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223 not found: ID does not exist" containerID="9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223" Apr 24 21:33:21.587827 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.587763 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223"} err="failed to get container status \"9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223\": rpc error: code = NotFound desc = could not find container \"9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223\": container with ID starting with 
9f6e8b04827930f579ed02910248c0f53de8d96049d28a7edd48c85f511f7223 not found: ID does not exist" Apr 24 21:33:21.587827 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.587782 2574 scope.go:117] "RemoveContainer" containerID="ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b" Apr 24 21:33:21.588057 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:33:21.588027 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b\": container with ID starting with ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b not found: ID does not exist" containerID="ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b" Apr 24 21:33:21.588140 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.588061 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b"} err="failed to get container status \"ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b\": rpc error: code = NotFound desc = could not find container \"ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b\": container with ID starting with ab7015300691528ca626f2272af9ca1ca36b3d672f215e61ed49f42ab91b809b not found: ID does not exist" Apr 24 21:33:21.629892 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.629871 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:33:21.630111 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630100 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="prometheus" Apr 24 21:33:21.630111 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630112 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" 
containerName="prometheus" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630118 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="config-reloader" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630123 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="config-reloader" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630133 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630139 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630151 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-web" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630156 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-web" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630162 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-thanos" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630167 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-thanos" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630178 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="init-config-reloader" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630183 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="init-config-reloader" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630190 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="thanos-sidecar" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630195 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="thanos-sidecar" Apr 24 21:33:21.630244 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630249 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="prometheus" Apr 24 21:33:21.630808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630261 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="config-reloader" Apr 24 21:33:21.630808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630271 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="thanos-sidecar" Apr 24 21:33:21.630808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630277 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-web" Apr 24 21:33:21.630808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630283 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy" Apr 24 21:33:21.630808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.630289 2574 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" containerName="kube-rbac-proxy-thanos" Apr 24 21:33:21.634944 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.634929 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.637510 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.637492 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:33:21.637603 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.637511 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:33:21.637603 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.637528 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:33:21.637603 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.637543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8u3lfl3th7m67\"" Apr 24 21:33:21.637842 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.637828 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:33:21.638037 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638024 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:33:21.638100 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638078 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:33:21.638100 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638086 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:33:21.638192 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638137 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:33:21.638192 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638144 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-z9sfc\"" Apr 24 21:33:21.638192 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638157 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:33:21.638192 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638091 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:33:21.638846 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.638829 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:33:21.640742 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.640720 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:33:21.643631 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.643613 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:33:21.652321 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.652299 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:33:21.732116 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/c60e069f-41db-46e2-8df5-2fb3c07c7091-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732261 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-web-config\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732261 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732163 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732261 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732190 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-config\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732261 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c60e069f-41db-46e2-8df5-2fb3c07c7091-config-out\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732475 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732279 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732475 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732475 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732475 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732475 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp8j6\" (UniqueName: \"kubernetes.io/projected/c60e069f-41db-46e2-8df5-2fb3c07c7091-kube-api-access-bp8j6\") pod \"prometheus-k8s-0\" (UID: 
\"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732475 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732475 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732608 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732638 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732708 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.732856 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.732708 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833548 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833548 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833548 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833765 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833765 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833879 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833940 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.833940 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c60e069f-41db-46e2-8df5-2fb3c07c7091-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834031 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.833973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-web-config\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834031 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834114 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834036 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-config\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834732 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834404 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834732 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834910 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c60e069f-41db-46e2-8df5-2fb3c07c7091-config-out\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834910 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834910 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834811 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834910 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834910 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834865 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.834910 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp8j6\" (UniqueName: \"kubernetes.io/projected/c60e069f-41db-46e2-8df5-2fb3c07c7091-kube-api-access-bp8j6\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.835245 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.834921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
21:33:21.836914 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.836610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.836914 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.836816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.836914 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.836824 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.837886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.837271 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c60e069f-41db-46e2-8df5-2fb3c07c7091-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.837886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.837564 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-web-config\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.837886 
ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.837650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.837886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.837681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.837886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.837782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.837886 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.837818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.838271 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.837949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c60e069f-41db-46e2-8df5-2fb3c07c7091-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.838271 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.838046 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c60e069f-41db-46e2-8df5-2fb3c07c7091-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.838976 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.838948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.839041 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.839029 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-config\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.839568 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.839549 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c60e069f-41db-46e2-8df5-2fb3c07c7091-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.839703 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.839686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c60e069f-41db-46e2-8df5-2fb3c07c7091-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.851830 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.851810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp8j6\" (UniqueName: \"kubernetes.io/projected/c60e069f-41db-46e2-8df5-2fb3c07c7091-kube-api-access-bp8j6\") pod \"prometheus-k8s-0\" (UID: \"c60e069f-41db-46e2-8df5-2fb3c07c7091\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:21.944039 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:21.944008 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:22.446818 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:22.446774 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:33:22.448692 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:33:22.448661 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc60e069f_41db_46e2_8df5_2fb3c07c7091.slice/crio-8a3c1dd93add3a09406267bac364c4aaa3f465580d0c2b9fcc229c17a642d0c5 WatchSource:0}: Error finding container 8a3c1dd93add3a09406267bac364c4aaa3f465580d0c2b9fcc229c17a642d0c5: Status 404 returned error can't find the container with id 8a3c1dd93add3a09406267bac364c4aaa3f465580d0c2b9fcc229c17a642d0c5 Apr 24 21:33:22.541350 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:22.541319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" event={"ID":"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180","Type":"ContainerStarted","Data":"9ca1292084ae539526073086ae66b7b23273357d6187fe84af71bd4ead525466"} Apr 24 21:33:22.541598 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:22.541353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" 
event={"ID":"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180","Type":"ContainerStarted","Data":"f7775546fcbc5dc986eea9c5fb0e17057c2d793212c5689484ef937024b80457"} Apr 24 21:33:22.542549 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:22.542527 2574 generic.go:358] "Generic (PLEG): container finished" podID="c60e069f-41db-46e2-8df5-2fb3c07c7091" containerID="adbb4cf01018b9eb88cb25f00cbdebe349b2cf43802708f0c9f49146ca4cd8e3" exitCode=0 Apr 24 21:33:22.542627 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:22.542605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerDied","Data":"adbb4cf01018b9eb88cb25f00cbdebe349b2cf43802708f0c9f49146ca4cd8e3"} Apr 24 21:33:22.542675 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:22.542643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerStarted","Data":"8a3c1dd93add3a09406267bac364c4aaa3f465580d0c2b9fcc229c17a642d0c5"} Apr 24 21:33:22.716715 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:22.716688 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed03c7e-bf02-42a0-90e4-c8c8027b31c2" path="/var/lib/kubelet/pods/1ed03c7e-bf02-42a0-90e4-c8c8027b31c2/volumes" Apr 24 21:33:23.552281 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.552241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" event={"ID":"0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180","Type":"ContainerStarted","Data":"4fc67676d0e40e6edb6c2758b5af507b62e57e154f748c5896b92ac2f4bb202a"} Apr 24 21:33:23.555388 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.555358 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerStarted","Data":"bc88b28f103783eecc64af029d2de8e34f777587d8ea04d7ab08cfe2d527e66a"} Apr 24 21:33:23.555530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.555395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerStarted","Data":"b670185b7285053c6f5e71e964c77727e6a2e210b4d902fd38d12a2cbdef5285"} Apr 24 21:33:23.555530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.555415 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerStarted","Data":"9e30920dc706758110c36341fd01f145ed7a7856d479a1d570023ccd01717a8f"} Apr 24 21:33:23.555530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.555426 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerStarted","Data":"82cf1a523cb4b7f226b62b8e744c3d40e3e66cce6e022a366453a2c347a0d4bb"} Apr 24 21:33:23.555530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.555439 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerStarted","Data":"f6bb25961bd673c42b325ebad6271d3a497dd1d1bc3ff1274ba4de42cc33c243"} Apr 24 21:33:23.555530 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.555450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c60e069f-41db-46e2-8df5-2fb3c07c7091","Type":"ContainerStarted","Data":"905fc7f312fe36949565c8b8d6f3db7ce98cd6d5ee356c4184155b4c3df082f1"} Apr 24 21:33:23.579761 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.579633 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/telemeter-client-6569cfb4f9-h2stm" podStartSLOduration=2.429113541 podStartE2EDuration="4.579615347s" podCreationTimestamp="2026-04-24 21:33:19 +0000 UTC" firstStartedPulling="2026-04-24 21:33:20.218792938 +0000 UTC m=+274.089770421" lastFinishedPulling="2026-04-24 21:33:22.369294735 +0000 UTC m=+276.240272227" observedRunningTime="2026-04-24 21:33:23.578218529 +0000 UTC m=+277.449196031" watchObservedRunningTime="2026-04-24 21:33:23.579615347 +0000 UTC m=+277.450592848" Apr 24 21:33:23.609092 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:23.609046 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.609034799 podStartE2EDuration="2.609034799s" podCreationTimestamp="2026-04-24 21:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:23.607729178 +0000 UTC m=+277.478706695" watchObservedRunningTime="2026-04-24 21:33:23.609034799 +0000 UTC m=+277.480012298" Apr 24 21:33:26.944344 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:26.944311 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:33:46.600627 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:46.600600 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:33:46.601694 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:46.601666 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:33:46.604784 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:33:46.604765 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:34:21.944961 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:34:21.944923 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:34:21.960646 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:34:21.960623 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:34:22.738797 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:34:22.738773 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:38:46.622799 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:38:46.622773 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:38:46.624894 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:38:46.624872 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:39:41.749516 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.749478 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-sc46j"] Apr 24 21:39:41.752621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.752585 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sc46j" Apr 24 21:39:41.755011 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.754985 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:39:41.755179 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.755037 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:39:41.755179 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.755141 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:39:41.755355 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.755335 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vf9w4\"" Apr 24 21:39:41.761179 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.761156 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sc46j"] Apr 24 21:39:41.853008 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.852982 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fwz\" (UniqueName: \"kubernetes.io/projected/f6e13c60-3a84-487a-8769-775e65a5c40e-kube-api-access-v7fwz\") pod \"s3-init-sc46j\" (UID: \"f6e13c60-3a84-487a-8769-775e65a5c40e\") " pod="kserve/s3-init-sc46j" Apr 24 21:39:41.954306 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.954283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fwz\" (UniqueName: \"kubernetes.io/projected/f6e13c60-3a84-487a-8769-775e65a5c40e-kube-api-access-v7fwz\") pod \"s3-init-sc46j\" (UID: \"f6e13c60-3a84-487a-8769-775e65a5c40e\") " pod="kserve/s3-init-sc46j" Apr 24 21:39:41.964063 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:41.964038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fwz\" 
(UniqueName: \"kubernetes.io/projected/f6e13c60-3a84-487a-8769-775e65a5c40e-kube-api-access-v7fwz\") pod \"s3-init-sc46j\" (UID: \"f6e13c60-3a84-487a-8769-775e65a5c40e\") " pod="kserve/s3-init-sc46j" Apr 24 21:39:42.063478 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:42.063422 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sc46j" Apr 24 21:39:42.179116 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:42.179086 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sc46j"] Apr 24 21:39:42.182123 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:39:42.182085 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e13c60_3a84_487a_8769_775e65a5c40e.slice/crio-cc158e4e74083b2425e53fa92b855700bedefb6bb539c0d0441429fce9dc3fc3 WatchSource:0}: Error finding container cc158e4e74083b2425e53fa92b855700bedefb6bb539c0d0441429fce9dc3fc3: Status 404 returned error can't find the container with id cc158e4e74083b2425e53fa92b855700bedefb6bb539c0d0441429fce9dc3fc3 Apr 24 21:39:42.183967 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:42.183951 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:39:42.635478 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:42.635436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sc46j" event={"ID":"f6e13c60-3a84-487a-8769-775e65a5c40e","Type":"ContainerStarted","Data":"cc158e4e74083b2425e53fa92b855700bedefb6bb539c0d0441429fce9dc3fc3"} Apr 24 21:39:46.653124 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:46.653089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sc46j" event={"ID":"f6e13c60-3a84-487a-8769-775e65a5c40e","Type":"ContainerStarted","Data":"302de834ea3503dfa0ec01ebeecf4e9a5d207be440156b93cce27650d21fc6b6"} Apr 24 21:39:46.671347 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:39:46.671156 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-sc46j" podStartSLOduration=1.3277676330000001 podStartE2EDuration="5.671140701s" podCreationTimestamp="2026-04-24 21:39:41 +0000 UTC" firstStartedPulling="2026-04-24 21:39:42.184075108 +0000 UTC m=+656.055052588" lastFinishedPulling="2026-04-24 21:39:46.527448176 +0000 UTC m=+660.398425656" observedRunningTime="2026-04-24 21:39:46.670599834 +0000 UTC m=+660.541577338" watchObservedRunningTime="2026-04-24 21:39:46.671140701 +0000 UTC m=+660.542118201" Apr 24 21:39:49.662088 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:49.662057 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6e13c60-3a84-487a-8769-775e65a5c40e" containerID="302de834ea3503dfa0ec01ebeecf4e9a5d207be440156b93cce27650d21fc6b6" exitCode=0 Apr 24 21:39:49.662409 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:49.662120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sc46j" event={"ID":"f6e13c60-3a84-487a-8769-775e65a5c40e","Type":"ContainerDied","Data":"302de834ea3503dfa0ec01ebeecf4e9a5d207be440156b93cce27650d21fc6b6"} Apr 24 21:39:50.796863 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:50.796842 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sc46j" Apr 24 21:39:50.925630 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:50.925566 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7fwz\" (UniqueName: \"kubernetes.io/projected/f6e13c60-3a84-487a-8769-775e65a5c40e-kube-api-access-v7fwz\") pod \"f6e13c60-3a84-487a-8769-775e65a5c40e\" (UID: \"f6e13c60-3a84-487a-8769-775e65a5c40e\") " Apr 24 21:39:50.927586 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:50.927558 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e13c60-3a84-487a-8769-775e65a5c40e-kube-api-access-v7fwz" (OuterVolumeSpecName: "kube-api-access-v7fwz") pod "f6e13c60-3a84-487a-8769-775e65a5c40e" (UID: "f6e13c60-3a84-487a-8769-775e65a5c40e"). InnerVolumeSpecName "kube-api-access-v7fwz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:51.026072 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:51.026051 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7fwz\" (UniqueName: \"kubernetes.io/projected/f6e13c60-3a84-487a-8769-775e65a5c40e-kube-api-access-v7fwz\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:39:51.669936 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:51.669909 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sc46j" Apr 24 21:39:51.670090 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:51.669938 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sc46j" event={"ID":"f6e13c60-3a84-487a-8769-775e65a5c40e","Type":"ContainerDied","Data":"cc158e4e74083b2425e53fa92b855700bedefb6bb539c0d0441429fce9dc3fc3"} Apr 24 21:39:51.670090 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:39:51.669969 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc158e4e74083b2425e53fa92b855700bedefb6bb539c0d0441429fce9dc3fc3" Apr 24 21:43:46.649009 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:43:46.648978 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:43:46.651598 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:43:46.651575 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:48:46.669318 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:48:46.669282 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:48:46.672219 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:48:46.672195 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:53:32.339670 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.339637 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vlntm/must-gather-2kxkw"] Apr 24 21:53:32.340128 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.339933 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6e13c60-3a84-487a-8769-775e65a5c40e" 
containerName="s3-init" Apr 24 21:53:32.340128 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.339943 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e13c60-3a84-487a-8769-775e65a5c40e" containerName="s3-init" Apr 24 21:53:32.340128 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.339992 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6e13c60-3a84-487a-8769-775e65a5c40e" containerName="s3-init" Apr 24 21:53:32.343111 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.343092 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.345519 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.345493 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vlntm\"/\"default-dockercfg-9gzms\"" Apr 24 21:53:32.345628 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.345522 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vlntm\"/\"kube-root-ca.crt\"" Apr 24 21:53:32.345628 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.345582 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vlntm\"/\"openshift-service-ca.crt\"" Apr 24 21:53:32.352656 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.352635 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vlntm/must-gather-2kxkw"] Apr 24 21:53:32.473567 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.473544 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec46c47-92c3-4850-a317-ac67a644e567-must-gather-output\") pod \"must-gather-2kxkw\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.473699 ip-10-0-142-162 kubenswrapper[2574]: 
I0424 21:53:32.473592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdqhk\" (UniqueName: \"kubernetes.io/projected/5ec46c47-92c3-4850-a317-ac67a644e567-kube-api-access-vdqhk\") pod \"must-gather-2kxkw\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.574621 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.574579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec46c47-92c3-4850-a317-ac67a644e567-must-gather-output\") pod \"must-gather-2kxkw\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.574738 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.574650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdqhk\" (UniqueName: \"kubernetes.io/projected/5ec46c47-92c3-4850-a317-ac67a644e567-kube-api-access-vdqhk\") pod \"must-gather-2kxkw\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.574914 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.574896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec46c47-92c3-4850-a317-ac67a644e567-must-gather-output\") pod \"must-gather-2kxkw\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.584084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.584049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdqhk\" (UniqueName: \"kubernetes.io/projected/5ec46c47-92c3-4850-a317-ac67a644e567-kube-api-access-vdqhk\") pod \"must-gather-2kxkw\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " 
pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.652479 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.652458 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:53:32.786483 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.786427 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vlntm/must-gather-2kxkw"] Apr 24 21:53:32.789311 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:53:32.789284 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec46c47_92c3_4850_a317_ac67a644e567.slice/crio-0da7e2fc4f82dca1bea6189da4ce1e0fcdfe74778b8bdef87fb4ceb6c37f0df5 WatchSource:0}: Error finding container 0da7e2fc4f82dca1bea6189da4ce1e0fcdfe74778b8bdef87fb4ceb6c37f0df5: Status 404 returned error can't find the container with id 0da7e2fc4f82dca1bea6189da4ce1e0fcdfe74778b8bdef87fb4ceb6c37f0df5 Apr 24 21:53:32.790927 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:32.790910 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:53:33.024140 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:33.024083 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlntm/must-gather-2kxkw" event={"ID":"5ec46c47-92c3-4850-a317-ac67a644e567","Type":"ContainerStarted","Data":"0da7e2fc4f82dca1bea6189da4ce1e0fcdfe74778b8bdef87fb4ceb6c37f0df5"} Apr 24 21:53:38.041374 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:38.041340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlntm/must-gather-2kxkw" event={"ID":"5ec46c47-92c3-4850-a317-ac67a644e567","Type":"ContainerStarted","Data":"14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f"} Apr 24 21:53:38.041374 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:38.041377 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-vlntm/must-gather-2kxkw" event={"ID":"5ec46c47-92c3-4850-a317-ac67a644e567","Type":"ContainerStarted","Data":"17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633"} Apr 24 21:53:38.059902 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:38.059857 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vlntm/must-gather-2kxkw" podStartSLOduration=1.5472331910000001 podStartE2EDuration="6.059843828s" podCreationTimestamp="2026-04-24 21:53:32 +0000 UTC" firstStartedPulling="2026-04-24 21:53:32.791032353 +0000 UTC m=+1486.662009833" lastFinishedPulling="2026-04-24 21:53:37.303642985 +0000 UTC m=+1491.174620470" observedRunningTime="2026-04-24 21:53:38.057621624 +0000 UTC m=+1491.928599123" watchObservedRunningTime="2026-04-24 21:53:38.059843828 +0000 UTC m=+1491.930821327" Apr 24 21:53:46.698358 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:46.698328 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:53:46.698808 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:46.698783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log" Apr 24 21:53:56.097170 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:56.097134 2574 generic.go:358] "Generic (PLEG): container finished" podID="5ec46c47-92c3-4850-a317-ac67a644e567" containerID="17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633" exitCode=0 Apr 24 21:53:56.097575 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:56.097196 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vlntm/must-gather-2kxkw" event={"ID":"5ec46c47-92c3-4850-a317-ac67a644e567","Type":"ContainerDied","Data":"17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633"} Apr 24 
21:53:56.097575 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:56.097526 2574 scope.go:117] "RemoveContainer" containerID="17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633" Apr 24 21:53:56.131650 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:56.131607 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlntm_must-gather-2kxkw_5ec46c47-92c3-4850-a317-ac67a644e567/gather/0.log" Apr 24 21:53:59.199364 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:59.199277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4xgsz_f1b28df9-d260-40f6-ba3b-63772a458eeb/global-pull-secret-syncer/0.log" Apr 24 21:53:59.416084 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:59.416056 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q8sjc_f8b3ba0e-889f-4f1c-9e20-33df1e811158/konnectivity-agent/0.log" Apr 24 21:53:59.486689 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:53:59.486615 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-162.ec2.internal_a49fc3c8bc7b82f59a5e7858c439f11d/haproxy/0.log" Apr 24 21:54:01.497531 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.497499 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vlntm/must-gather-2kxkw"] Apr 24 21:54:01.497933 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.497771 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vlntm/must-gather-2kxkw" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" containerName="copy" containerID="cri-o://14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f" gracePeriod=2 Apr 24 21:54:01.499682 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.499641 2574 status_manager.go:895] "Failed to get status for pod" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" 
pod="openshift-must-gather-vlntm/must-gather-2kxkw" err="pods \"must-gather-2kxkw\" is forbidden: User \"system:node:ip-10-0-142-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vlntm\": no relationship found between node 'ip-10-0-142-162.ec2.internal' and this object" Apr 24 21:54:01.500354 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.500332 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vlntm/must-gather-2kxkw"] Apr 24 21:54:01.717685 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.717663 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlntm_must-gather-2kxkw_5ec46c47-92c3-4850-a317-ac67a644e567/copy/0.log" Apr 24 21:54:01.718011 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.717996 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:54:01.720021 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.719998 2574 status_manager.go:895] "Failed to get status for pod" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" pod="openshift-must-gather-vlntm/must-gather-2kxkw" err="pods \"must-gather-2kxkw\" is forbidden: User \"system:node:ip-10-0-142-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vlntm\": no relationship found between node 'ip-10-0-142-162.ec2.internal' and this object" Apr 24 21:54:01.825083 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.825033 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec46c47-92c3-4850-a317-ac67a644e567-must-gather-output\") pod \"5ec46c47-92c3-4850-a317-ac67a644e567\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " Apr 24 21:54:01.825083 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.825076 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vdqhk\" (UniqueName: \"kubernetes.io/projected/5ec46c47-92c3-4850-a317-ac67a644e567-kube-api-access-vdqhk\") pod \"5ec46c47-92c3-4850-a317-ac67a644e567\" (UID: \"5ec46c47-92c3-4850-a317-ac67a644e567\") " Apr 24 21:54:01.826452 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.826427 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec46c47-92c3-4850-a317-ac67a644e567-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5ec46c47-92c3-4850-a317-ac67a644e567" (UID: "5ec46c47-92c3-4850-a317-ac67a644e567"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:01.827172 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.827147 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec46c47-92c3-4850-a317-ac67a644e567-kube-api-access-vdqhk" (OuterVolumeSpecName: "kube-api-access-vdqhk") pod "5ec46c47-92c3-4850-a317-ac67a644e567" (UID: "5ec46c47-92c3-4850-a317-ac67a644e567"). InnerVolumeSpecName "kube-api-access-vdqhk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:54:01.925557 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.925536 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ec46c47-92c3-4850-a317-ac67a644e567-must-gather-output\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:54:01.925557 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:01.925558 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdqhk\" (UniqueName: \"kubernetes.io/projected/5ec46c47-92c3-4850-a317-ac67a644e567-kube-api-access-vdqhk\") on node \"ip-10-0-142-162.ec2.internal\" DevicePath \"\"" Apr 24 21:54:02.115589 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.115567 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vlntm_must-gather-2kxkw_5ec46c47-92c3-4850-a317-ac67a644e567/copy/0.log" Apr 24 21:54:02.115873 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.115851 2574 generic.go:358] "Generic (PLEG): container finished" podID="5ec46c47-92c3-4850-a317-ac67a644e567" containerID="14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f" exitCode=143 Apr 24 21:54:02.115932 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.115901 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vlntm/must-gather-2kxkw" Apr 24 21:54:02.115970 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.115940 2574 scope.go:117] "RemoveContainer" containerID="14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f" Apr 24 21:54:02.118041 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.118018 2574 status_manager.go:895] "Failed to get status for pod" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" pod="openshift-must-gather-vlntm/must-gather-2kxkw" err="pods \"must-gather-2kxkw\" is forbidden: User \"system:node:ip-10-0-142-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vlntm\": no relationship found between node 'ip-10-0-142-162.ec2.internal' and this object" Apr 24 21:54:02.124697 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.124679 2574 scope.go:117] "RemoveContainer" containerID="17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633" Apr 24 21:54:02.126962 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.126933 2574 status_manager.go:895] "Failed to get status for pod" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" pod="openshift-must-gather-vlntm/must-gather-2kxkw" err="pods \"must-gather-2kxkw\" is forbidden: User \"system:node:ip-10-0-142-162.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vlntm\": no relationship found between node 'ip-10-0-142-162.ec2.internal' and this object" Apr 24 21:54:02.135850 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.135835 2574 scope.go:117] "RemoveContainer" containerID="14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f" Apr 24 21:54:02.136098 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:54:02.136075 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f\": container with ID 
starting with 14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f not found: ID does not exist" containerID="14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f" Apr 24 21:54:02.136156 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.136109 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f"} err="failed to get container status \"14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f\": rpc error: code = NotFound desc = could not find container \"14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f\": container with ID starting with 14746c4dfe4d0fdbea66a56305b280406f4bcac1ec707c490993cbf9b424467f not found: ID does not exist" Apr 24 21:54:02.136156 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.136131 2574 scope.go:117] "RemoveContainer" containerID="17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633" Apr 24 21:54:02.136394 ip-10-0-142-162 kubenswrapper[2574]: E0424 21:54:02.136374 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633\": container with ID starting with 17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633 not found: ID does not exist" containerID="17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633" Apr 24 21:54:02.136433 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.136400 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633"} err="failed to get container status \"17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633\": rpc error: code = NotFound desc = could not find container \"17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633\": container with ID starting with 
17fbfdef086442d0ef1f456401a4a9b3d11d5e6511dafdfd2ea331adfc9fb633 not found: ID does not exist" Apr 24 21:54:02.512053 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.512023 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23342931-f34f-4e4b-92c6-85da5c12481d/alertmanager/0.log" Apr 24 21:54:02.546105 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.546080 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23342931-f34f-4e4b-92c6-85da5c12481d/config-reloader/0.log" Apr 24 21:54:02.575157 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.575137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23342931-f34f-4e4b-92c6-85da5c12481d/kube-rbac-proxy-web/0.log" Apr 24 21:54:02.605959 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.605918 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23342931-f34f-4e4b-92c6-85da5c12481d/kube-rbac-proxy/0.log" Apr 24 21:54:02.634488 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.634466 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23342931-f34f-4e4b-92c6-85da5c12481d/kube-rbac-proxy-metric/0.log" Apr 24 21:54:02.664531 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.664509 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23342931-f34f-4e4b-92c6-85da5c12481d/prom-label-proxy/0.log" Apr 24 21:54:02.687045 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.687025 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_23342931-f34f-4e4b-92c6-85da5c12481d/init-config-reloader/0.log" Apr 24 21:54:02.716934 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.716908 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5ec46c47-92c3-4850-a317-ac67a644e567" path="/var/lib/kubelet/pods/5ec46c47-92c3-4850-a317-ac67a644e567/volumes" Apr 24 21:54:02.852587 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.852519 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-bkrnl_25135849-7d9d-4889-aa15-bd6bcbd9cf27/monitoring-plugin/0.log" Apr 24 21:54:02.964901 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.964882 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rm2gn_47714f4e-937b-411d-b7ba-1d3c613a90ab/node-exporter/0.log" Apr 24 21:54:02.990271 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:02.990254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rm2gn_47714f4e-937b-411d-b7ba-1d3c613a90ab/kube-rbac-proxy/0.log" Apr 24 21:54:03.015447 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.015427 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rm2gn_47714f4e-937b-411d-b7ba-1d3c613a90ab/init-textfile/0.log" Apr 24 21:54:03.139374 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.139354 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dd2lk_98a3c479-04bc-4dfa-850a-9146a8ebcda5/kube-rbac-proxy-main/0.log" Apr 24 21:54:03.163410 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.163389 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dd2lk_98a3c479-04bc-4dfa-850a-9146a8ebcda5/kube-rbac-proxy-self/0.log" Apr 24 21:54:03.188670 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.188650 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dd2lk_98a3c479-04bc-4dfa-850a-9146a8ebcda5/openshift-state-metrics/0.log" Apr 24 21:54:03.228197 ip-10-0-142-162 kubenswrapper[2574]: 
I0424 21:54:03.228178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c60e069f-41db-46e2-8df5-2fb3c07c7091/prometheus/0.log" Apr 24 21:54:03.251488 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.251463 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c60e069f-41db-46e2-8df5-2fb3c07c7091/config-reloader/0.log" Apr 24 21:54:03.277095 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.277078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c60e069f-41db-46e2-8df5-2fb3c07c7091/thanos-sidecar/0.log" Apr 24 21:54:03.303018 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.303002 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c60e069f-41db-46e2-8df5-2fb3c07c7091/kube-rbac-proxy-web/0.log" Apr 24 21:54:03.326583 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.326564 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c60e069f-41db-46e2-8df5-2fb3c07c7091/kube-rbac-proxy/0.log" Apr 24 21:54:03.354457 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.354437 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c60e069f-41db-46e2-8df5-2fb3c07c7091/kube-rbac-proxy-thanos/0.log" Apr 24 21:54:03.379360 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.379343 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c60e069f-41db-46e2-8df5-2fb3c07c7091/init-config-reloader/0.log" Apr 24 21:54:03.494668 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.494580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6569cfb4f9-h2stm_0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180/telemeter-client/0.log" Apr 24 21:54:03.522754 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:54:03.522725 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6569cfb4f9-h2stm_0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180/reload/0.log" Apr 24 21:54:03.552882 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:03.552841 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6569cfb4f9-h2stm_0efd4b5d-ceb1-4ef4-92cc-fcdbecb49180/kube-rbac-proxy/0.log" Apr 24 21:54:06.018517 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.018488 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"] Apr 24 21:54:06.018955 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.018846 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" containerName="gather" Apr 24 21:54:06.018955 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.018860 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" containerName="gather" Apr 24 21:54:06.018955 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.018870 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" containerName="copy" Apr 24 21:54:06.018955 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.018880 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" containerName="copy" Apr 24 21:54:06.018955 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.018928 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" containerName="gather" Apr 24 21:54:06.018955 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.018940 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ec46c47-92c3-4850-a317-ac67a644e567" containerName="copy" Apr 24 21:54:06.021668 ip-10-0-142-162 kubenswrapper[2574]: I0424 
21:54:06.021651 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.023873 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.023852 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fdcwc\"/\"openshift-service-ca.crt\"" Apr 24 21:54:06.023985 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.023855 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fdcwc\"/\"kube-root-ca.crt\"" Apr 24 21:54:06.024525 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.024496 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fdcwc\"/\"default-dockercfg-gd25x\"" Apr 24 21:54:06.031563 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.031542 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"] Apr 24 21:54:06.159176 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.159148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-sys\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.159292 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.159175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-proc\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.159292 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.159199 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-podres\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.159292 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.159270 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgxlx\" (UniqueName: \"kubernetes.io/projected/652e8a60-198f-4a30-8b80-ea5b383c37c9-kube-api-access-wgxlx\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.159395 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.159306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-lib-modules\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.260108 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260082 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-proc\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.260247 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-podres\") pod \"perf-node-gather-daemonset-pckgs\" (UID: 
\"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.260247 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgxlx\" (UniqueName: \"kubernetes.io/projected/652e8a60-198f-4a30-8b80-ea5b383c37c9-kube-api-access-wgxlx\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.260247 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-lib-modules\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.260247 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-proc\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.260441 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-podres\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" Apr 24 21:54:06.260441 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-sys\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"
Apr 24 21:54:06.260441 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-lib-modules\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"
Apr 24 21:54:06.260441 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.260357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/652e8a60-198f-4a30-8b80-ea5b383c37c9-sys\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"
Apr 24 21:54:06.269182 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.269135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgxlx\" (UniqueName: \"kubernetes.io/projected/652e8a60-198f-4a30-8b80-ea5b383c37c9-kube-api-access-wgxlx\") pod \"perf-node-gather-daemonset-pckgs\" (UID: \"652e8a60-198f-4a30-8b80-ea5b383c37c9\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"
Apr 24 21:54:06.332116 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.332097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"
Apr 24 21:54:06.445474 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.445451 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"]
Apr 24 21:54:06.448198 ip-10-0-142-162 kubenswrapper[2574]: W0424 21:54:06.448172 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod652e8a60_198f_4a30_8b80_ea5b383c37c9.slice/crio-4741d2a79ebbd7cae157e1276d33a613864e2607936605db5ef38b2f36c7c1d1 WatchSource:0}: Error finding container 4741d2a79ebbd7cae157e1276d33a613864e2607936605db5ef38b2f36c7c1d1: Status 404 returned error can't find the container with id 4741d2a79ebbd7cae157e1276d33a613864e2607936605db5ef38b2f36c7c1d1
Apr 24 21:54:06.963370 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.963346 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jtvxc_3854f8f7-804b-4511-a3f3-1b96449f8b70/dns/0.log"
Apr 24 21:54:06.998277 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:06.998259 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jtvxc_3854f8f7-804b-4511-a3f3-1b96449f8b70/kube-rbac-proxy/0.log"
Apr 24 21:54:07.087146 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:07.087124 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cwvxk_da82016d-3774-4430-881a-6479d2a7aa8c/dns-node-resolver/0.log"
Apr 24 21:54:07.129128 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:07.129098 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" event={"ID":"652e8a60-198f-4a30-8b80-ea5b383c37c9","Type":"ContainerStarted","Data":"87f12d7f50bf964a0d48888fadfdfed47dc2b6a0bf782c41336a4e04c17da139"}
Apr 24 21:54:07.129220 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:07.129132 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" event={"ID":"652e8a60-198f-4a30-8b80-ea5b383c37c9","Type":"ContainerStarted","Data":"4741d2a79ebbd7cae157e1276d33a613864e2607936605db5ef38b2f36c7c1d1"}
Apr 24 21:54:07.129296 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:07.129273 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"
Apr 24 21:54:07.147756 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:07.147717 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs" podStartSLOduration=1.147705251 podStartE2EDuration="1.147705251s" podCreationTimestamp="2026-04-24 21:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:07.145459827 +0000 UTC m=+1521.016437340" watchObservedRunningTime="2026-04-24 21:54:07.147705251 +0000 UTC m=+1521.018682750"
Apr 24 21:54:07.660207 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:07.660184 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cdrzz_2fc5cade-b0b3-414a-88b0-ae3c0348001f/node-ca/0.log"
Apr 24 21:54:08.828894 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:08.828848 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r8jwq_0ee47bd0-7e68-48fd-8896-e4693d5e8f21/serve-healthcheck-canary/0.log"
Apr 24 21:54:09.387437 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:09.387408 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lrjh7_55b8274d-55b8-458d-848c-e205acc6cd3b/kube-rbac-proxy/0.log"
Apr 24 21:54:09.409709 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:09.409688 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lrjh7_55b8274d-55b8-458d-848c-e205acc6cd3b/exporter/0.log"
Apr 24 21:54:09.433002 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:09.432968 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lrjh7_55b8274d-55b8-458d-848c-e205acc6cd3b/extractor/0.log"
Apr 24 21:54:11.503424 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:11.503395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-sc46j_f6e13c60-3a84-487a-8769-775e65a5c40e/s3-init/0.log"
Apr 24 21:54:13.140607 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:13.140578 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-pckgs"
Apr 24 21:54:17.195632 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.195607 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn7sd_bf6a1a97-e9a0-4091-b077-931e1415d0c5/kube-multus-additional-cni-plugins/0.log"
Apr 24 21:54:17.220997 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.220973 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn7sd_bf6a1a97-e9a0-4091-b077-931e1415d0c5/egress-router-binary-copy/0.log"
Apr 24 21:54:17.245633 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.245617 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn7sd_bf6a1a97-e9a0-4091-b077-931e1415d0c5/cni-plugins/0.log"
Apr 24 21:54:17.273405 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.273389 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn7sd_bf6a1a97-e9a0-4091-b077-931e1415d0c5/bond-cni-plugin/0.log"
Apr 24 21:54:17.298349 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.298333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn7sd_bf6a1a97-e9a0-4091-b077-931e1415d0c5/routeoverride-cni/0.log"
Apr 24 21:54:17.322164 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.322123 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn7sd_bf6a1a97-e9a0-4091-b077-931e1415d0c5/whereabouts-cni-bincopy/0.log"
Apr 24 21:54:17.345724 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.345705 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wn7sd_bf6a1a97-e9a0-4091-b077-931e1415d0c5/whereabouts-cni/0.log"
Apr 24 21:54:17.590615 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.590504 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qqsnz_4dd2bc89-9e9b-4a45-b5fb-585ad0a71cd0/kube-multus/0.log"
Apr 24 21:54:17.762740 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.762706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tdnnb_fba6f53a-a544-4d53-ba11-2dd3b3259ed0/network-metrics-daemon/0.log"
Apr 24 21:54:17.784464 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:17.784434 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tdnnb_fba6f53a-a544-4d53-ba11-2dd3b3259ed0/kube-rbac-proxy/0.log"
Apr 24 21:54:18.520320 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.520292 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-controller/0.log"
Apr 24 21:54:18.544566 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.544544 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/0.log"
Apr 24 21:54:18.551871 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.551852 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovn-acl-logging/1.log"
Apr 24 21:54:18.571537 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.571518 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/kube-rbac-proxy-node/0.log"
Apr 24 21:54:18.593858 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.593833 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 21:54:18.619290 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.619273 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/northd/0.log"
Apr 24 21:54:18.643173 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.643153 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/nbdb/0.log"
Apr 24 21:54:18.666083 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.666056 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/sbdb/0.log"
Apr 24 21:54:18.757447 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:18.757424 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-256cw_3e1b294d-b645-40e3-b659-41031123c7f2/ovnkube-controller/0.log"
Apr 24 21:54:20.393970 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:20.393942 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vtshd_ab4b7ffb-2c75-4a6a-b8f4-287b1b3bb374/network-check-target-container/0.log"
Apr 24 21:54:21.270608 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:21.270583 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-d8cnv_39fba077-f532-47c2-b634-29e01862bef6/iptables-alerter/0.log"
Apr 24 21:54:21.978927 ip-10-0-142-162 kubenswrapper[2574]: I0424 21:54:21.978888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fkzj2_5a521b1a-3dde-4f1e-aa52-3728d09e9921/tuned/0.log"
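The entries above follow a regular shape: a journald prefix ("Apr 24 21:54:21.978927 ip-10-0-142-162 kubenswrapper[2574]:") followed by a klog-style line (severity letter and date such as "I0424", a timestamp, the pid, the source location "log.go:25]", and a structured message). A minimal parsing sketch is below, assuming exactly this layout; the regex and the `parsed_log_paths` helper are illustrative, not part of any kubelet or journald API. It pulls the pod log paths out of the "Finished parsing log file" entries.

```python
import re

# Best-effort pattern for the entry layout seen above, assuming the fixed form
# "<Mon DD HH:MM:SS.ffffff> <host> kubenswrapper[<pid>]: <klog line>",
# where the klog line is "<sev><MMDD> <time> <pid> <file>:<line>] <message>".
# This is a sketch for these specific entries, not an official format spec.
ENTRY_RE = re.compile(
    r'^(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+) '
    r'(?P<host>\S+) kubenswrapper\[(?P<pid>\d+)\]: '
    r'(?P<sev>[IWEF])\d{4} \S+ \d+ (?P<src>[^\]]+)\] (?P<msg>.*)$'
)
PATH_RE = re.compile(r'path="([^"]+)"')

def parsed_log_paths(lines):
    """Collect pod log paths from 'Finished parsing log file' entries."""
    paths = []
    for line in lines:
        m = ENTRY_RE.match(line)
        if m and "Finished parsing log file" in m.group("msg"):
            p = PATH_RE.search(m.group("msg"))
            if p:
                paths.append(p.group(1))
    return paths

# One entry copied verbatim from the log above.
sample = (
    'Apr 24 21:54:21.978927 ip-10-0-142-162 kubenswrapper[2574]: '
    'I0424 21:54:21.978888 2574 log.go:25] "Finished parsing log file" '
    'path="/var/log/pods/openshift-cluster-node-tuning-operator_'
    'tuned-fkzj2_5a521b1a-3dde-4f1e-aa52-3728d09e9921/tuned/0.log"'
)
```

For a live node the same filtering is usually done with `journalctl -u kubelet | grep 'Finished parsing log file'`; the Python version is only useful when you need the fields broken out.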