Apr 21 06:26:02.369183 ip-10-0-138-68 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 06:26:02.819240 ip-10-0-138-68 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:02.819240 ip-10-0-138-68 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 06:26:02.819240 ip-10-0-138-68 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:02.819240 ip-10-0-138-68 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 06:26:02.819240 ip-10-0-138-68 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:02.821132 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.821043 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 06:26:02.827224 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827206 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
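The deprecation warnings above all point at the file passed via --config (per the FLAG dump below, /etc/kubernetes/kubelet.conf). A minimal sketch of moving those flags into a KubeletConfiguration, assuming the stock kubelet.config.k8s.io/v1beta1 field names; values are copied from the flag values logged below, except the eviction threshold, which is only a placeholder:

```python
# Sketch: the deprecated kubelet flags from the warnings above, expressed as
# KubeletConfiguration fields. Field names assume the upstream
# kubelet.config.k8s.io/v1beta1 schema; values come from the FLAG dump below.
import yaml  # PyYAML, assumed available

kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    # replaces --container-runtime-endpoint
    "containerRuntimeEndpoint": "/var/run/crio/crio.sock",
    # replaces --volume-plugin-dir
    "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",
    # replaces --system-reserved
    "systemReserved": {"cpu": "500m", "ephemeral-storage": "1Gi", "memory": "1Gi"},
    # --minimum-container-ttl-duration is deprecated in favor of eviction
    # thresholds; this threshold is a placeholder, not taken from this log.
    "evictionHard": {"memory.available": "100Mi"},
}

print(yaml.safe_dump(kubelet_config, sort_keys=False))
```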
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827225 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827230 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827233 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827236 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827239 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827242 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827244 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827247 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827250 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827253 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827255 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827258 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827261 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827263 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827266 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827269 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:02.827264 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827272 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827280 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827283 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827285 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827288 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827291 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827294 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827296 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827299 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827301 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827304 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827306 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827309 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827312 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827314 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827317 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827320 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827322 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827324 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827327 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:02.827719 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827329 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827332 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827335 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827337 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827340 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827343 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827345 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827348 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827350 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827353 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827355 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827358 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827362 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827366 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827369 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827372 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827375 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827377 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827380 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:02.828226 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827382 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827385 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827387 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827390 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827392 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827394 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827397 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827400 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827402 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827405 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827407 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827410 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827413 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827415 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827417 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827420 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827423 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827425 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827430 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827432 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:02.828682 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827435 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827437 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827440 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827442 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827445 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827447 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827450 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827453 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827463 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:02.829229 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.827466 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
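Each feature_gate.go:328 line is the kubelet reporting a gate name it has no registration for: the set above consists of OpenShift cluster-level gates (GatewayAPI, MachineConfigNodes, PinnedImages, and so on) handed to a kubelet that only knows the upstream Kubernetes gates, and, as the log shows, unknown names are warned about and skipped rather than treated as fatal. A rough sketch of that warn-and-skip pattern, purely illustrative and not the actual feature_gate.go code:

```python
# Illustrative sketch of the warn-and-skip behavior behind the
# feature_gate.go:328 lines above; not the real Kubernetes implementation.
import logging

# Tiny stand-in for the kubelet's registered feature gates.
KNOWN_GATES = {"KMSv1", "NodeSwap", "ImageVolume", "ServiceAccountTokenNodeBinding"}

def apply_feature_gates(requested: dict) -> dict:
    """Keep known gates; warn on and skip unrecognized ones."""
    effective = {}
    for name, enabled in requested.items():
        if name in KNOWN_GATES:
            effective[name] = enabled
        else:
            logging.warning("unrecognized feature gate: %s", name)
    return effective

# An OpenShift-only gate such as GatewayAPI takes the warning path:
print(apply_feature_gates({"KMSv1": True, "GatewayAPI": True}))
```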
Apr 21 06:26:02.831195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829505 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 06:26:02.831195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829515 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 06:26:02.831195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829523 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 06:26:02.831195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829528 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829532 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829536 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829541 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829545 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829549 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829552 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829555 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829558 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829562 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829565 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829568 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829571 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829574 2570 flags.go:64] FLAG: --cloud-config=""
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829577 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829580 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829584 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829588 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829591 2570 flags.go:64] FLAG: --config-dir=""
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829594 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829597 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829601 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829604 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829608 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829612 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829615 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829618 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829621 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829624 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829628 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829632 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829635 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829638 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829641 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829644 2570 flags.go:64] FLAG: --enable-server="true"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829647 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829652 2570 flags.go:64] FLAG: --event-burst="100"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829655 2570 flags.go:64] FLAG: --event-qps="50"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829658 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829661 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829664 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829668 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829671 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829674 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829677 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829680 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829683 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829687 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829690 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 06:26:02.832287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829693 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829696 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829699 2570 flags.go:64] FLAG: --feature-gates=""
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829703 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829706 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829709 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829713 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829716 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829719 2570 flags.go:64] FLAG: --help="false"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829722 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-138-68.ec2.internal"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829726 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829729 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829732 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829735 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829739 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829742 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829745 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829748 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829751 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829754 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829757 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829760 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829762 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829766 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 21 06:26:02.832910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829770 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829772 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829776 2570 flags.go:64] FLAG: --lock-file=""
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829779 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829782 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829785 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829790 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829793 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829796 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829799 2570 flags.go:64] FLAG: --logging-format="text"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829802 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829805 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829808 2570 flags.go:64] FLAG: --manifest-url=""
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829811 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829821 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829824 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829828 2570 flags.go:64] FLAG: --max-pods="110"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829831 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829834 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829837 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829840 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829843 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829846 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829863 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829871 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 21 06:26:02.833504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829875 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829878 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829881 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829884 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829890 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829893 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829896 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829899 2570 flags.go:64] FLAG: --port="10250"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829902 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829905 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c3f85aa294ba1f0d"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829908 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829911 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829914 2570 flags.go:64] FLAG: --register-node="true"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829917 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829920 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829924 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829926 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829929 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829932 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829935 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829938 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829942 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829945 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829947 2570 flags.go:64] FLAG: --runonce="false"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829950 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 21 06:26:02.834145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829953 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829957 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829960 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829962 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829965 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829968 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829972 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829975 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829978 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829981 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829984 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829987 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829990 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829992 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.829998 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830001 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830004 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830008 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830011 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830013 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830016 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830019 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830022 2570 flags.go:64] FLAG: --v="2"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830027 2570 flags.go:64] FLAG: --version="false"
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830032 2570 flags.go:64] FLAG: --vmodule=""
Apr 21 06:26:02.834749 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830036 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 21 06:26:02.835366 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.830039 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
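The flags.go:64 lines dump every flag's effective value at startup, which is handy for auditing what the systemd unit actually passed to the kubelet. A small sketch that collects them from journal output into a dict, for example to diff two nodes' startup configs; the regex is an assumption matching the format shown above:

```python
# Sketch: pull the effective kubelet flags out of the flags.go:64 lines
# above. The regex is an assumption based on the format shown in this log.
import re

FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w.-]+)="(.*?)"')

def parse_flags(journal_text: str) -> dict:
    """Map each --flag name to its logged value."""
    return {m.group(1): m.group(2) for m in FLAG_RE.finditer(journal_text)}

line = ('Apr 21 06:26:02.831685 ip-10-0-138-68 kubenswrapper[2570]: '
        'I0421 06:26:02.829562 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"')
print(parse_flags(line))  # {'--cgroup-driver': 'cgroupfs'}
```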
ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830344 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830347 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830349 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830352 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830354 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830357 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830360 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830363 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.830365 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 06:26:02.837589 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.831134 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 06:26:02.839794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.839776 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 06:26:02.839794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.839795 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839846 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839850 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839867 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839871 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839874 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839877 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839880 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 06:26:02.839891 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:02.839884 2570 feature_gate.go:328] unrecognized feature gate: 
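The feature_gate.go:384 summary above is the ground truth for what actually took effect after all of the preceding warnings: only gates the kubelet binary implements appear, each with its resolved boolean. For triage it can help to turn that Go-map rendering into structured data. A minimal Python sketch, assuming the entry is available as a string (the helper name and regex are ad hoc, not part of the kubelet):

    import re

    # Parse the kubelet's "feature gates: {map[...]}" summary entry into a
    # dict. The Go map prints as space-separated Name:bool pairs.
    def parse_feature_gates(entry):
        m = re.search(r"feature gates: \{map\[(.*?)\]\}", entry)
        if not m:
            return {}
        return {name: val == "true"
                for name, val in (pair.split(":") for pair in m.group(1).split())}

    gates = parse_feature_gates("feature gates: {map[ImageVolume:true NodeSwap:false]}")
    assert gates == {"ImageVolume": True, "NodeSwap": False}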
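The kubelet evaluates its feature-gate configuration several times during startup, so each gate name its vendored gate registry does not know (these look like OpenShift cluster-level gates such as GatewayAPI or PinnedImages, which the kubelet binary itself likely does not implement) gets warned about again on every pass. When scanning a log like this, a frequency count reduces the noise to one reviewable line per gate. A rough sketch, again purely a triage aid and not kubelet code:

    import re
    from collections import Counter

    # Collapse repeated "unrecognized feature gate" warnings into one count
    # per gate name.
    def unknown_gate_counts(log_text):
        return Counter(re.findall(r"unrecognized feature gate: (\S+)", log_text))

    # Example usage:
    # for gate, n in unknown_gate_counts(open("kubelet.log").read()).most_common():
    #     print(gate, n)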
Apr 21 06:26:02.843947 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.841179 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 06:26:02.844872 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.844846 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 06:26:02.845862 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.845841 2570 server.go:1019] "Starting client certificate rotation"
Apr 21 06:26:02.845966 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.845947 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 06:26:02.846002 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.845995 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 06:26:02.872715 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.872687 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 06:26:02.877841 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.877820 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 06:26:02.893943 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.893924 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 21 06:26:02.899498 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.899485 2570 log.go:25] "Validated CRI v1 image API"
Apr 21 06:26:02.900709 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.900694 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 06:26:02.901967 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.901951 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 06:26:02.904567 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.904547 2570 fs.go:135] Filesystem UUIDs: map[51a5e01e-671e-4aec-904b-c9bbb1b22dd9:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 bb33befb-fb30-436d-903c-0bf58955325b:/dev/nvme0n1p4]
Apr 21 06:26:02.904636 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.904567 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
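The fs.go entries above enumerate the filesystems cAdvisor will track. If those device-to-mountpoint mappings are needed programmatically, the nested Go-struct rendering can be scraped; a sketch under the assumption that the format stays exactly as printed in this log (regex and helper are ad hoc, not a stable interface):

    import re

    # Extract device -> (mountpoint, fsType) pairs from the
    # "Filesystem partitions: map[...]" entry above.
    PART = re.compile(r"([/\w.-]+):\{mountpoint:(\S+) major:\d+ minor:\d+ fsType:(\w+)")

    def partitions(entry):
        return {dev: (mp, fs) for dev, mp, fs in PART.findall(entry)}

    sample = "map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0}]"
    assert partitions(sample) == {"/dev/nvme0n1p3": ("/boot", "ext4")}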
Apr 21 06:26:02.911808 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.911696 2570 manager.go:217] Machine: {Timestamp:2026-04-21 06:26:02.909485482 +0000 UTC m=+0.415950386 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098801 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec299597da5b21ec97cc5f2410830187 SystemUUID:ec299597-da5b-21ec-97cc-5f2410830187 BootID:4a0e1f35-1794-479e-b142-c815dc7d1aff Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ec:63:48:98:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ec:63:48:98:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:aa:ac:3b:75:3d:09 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 06:26:02.911808 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.911796 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 06:26:02.911989 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.911926 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 06:26:02.913016 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.912991 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 06:26:02.913188 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.913019 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-68.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 06:26:02.913274 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.913204 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 06:26:02.913274 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.913218 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 06:26:02.913274 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.913236 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 06:26:02.914113 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.914102 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 06:26:02.915514 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.915502 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 06:26:02.915813 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.915801 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 06:26:02.918164 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.918153 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 06:26:02.918221 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.918171 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 06:26:02.918221 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.918186 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 06:26:02.918221 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.918200 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 21 06:26:02.918221 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.918213 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 06:26:02.919415 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.919402 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 06:26:02.919490 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.919425 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 06:26:02.922570 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.922556 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 06:26:02.924060 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.924046 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 06:26:02.926053 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926041 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 06:26:02.926098 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926059 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 06:26:02.926098 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926065 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 06:26:02.926098 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926074 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 06:26:02.926098 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926084 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 06:26:02.926098 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926090 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 06:26:02.926098 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926095 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 06:26:02.926256 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926101 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 06:26:02.926256 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926109 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 06:26:02.926256 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926115 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 06:26:02.926256 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926132 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 06:26:02.926256 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.926141 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 06:26:02.927903 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.927888 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 06:26:02.927903 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.927904 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 06:26:02.929575 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.929553 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 06:26:02.929631 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.929603 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-68.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 06:26:02.931918 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.931905 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 06:26:02.931991 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.931946 2570 server.go:1295] "Started kubelet"
Apr 21 06:26:02.932071 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.932028 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 06:26:02.932123 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.932092 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 06:26:02.932171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.932143 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 06:26:02.932742 ip-10-0-138-68 systemd[1]: Started Kubernetes Kubelet.
Apr 21 06:26:02.933404 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.933392 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 06:26:02.934046 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.934009 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 06:26:02.938971 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.938956 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-68.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 06:26:02.939974 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.939068 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-68.ec2.internal.18a84b3968403199 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-68.ec2.internal,UID:ip-10-0-138-68.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-68.ec2.internal,},FirstTimestamp:2026-04-21 06:26:02.931917209 +0000 UTC m=+0.438382113,LastTimestamp:2026-04-21 06:26:02.931917209 +0000 UTC m=+0.438382113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-68.ec2.internal,}"
Apr 21 06:26:02.940286 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.940271 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 06:26:02.940825 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.940806 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 06:26:02.941553 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941534 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 06:26:02.941735 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941720 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 06:26:02.941799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941740 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 06:26:02.941905 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941885 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 06:26:02.941905 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941899 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 06:26:02.942049 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941920 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 06:26:02.942049 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941935 2570 factory.go:55] Registering systemd factory
Apr 21 06:26:02.942049 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.941943 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 21 06:26:02.942049 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.942037 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:02.942234 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.942152 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 06:26:02.942312 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.942236 2570 factory.go:153] Registering CRI-O factory
Apr 21 06:26:02.942312 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.942249 2570 factory.go:223] Registration of the crio container factory successfully
Apr 21 06:26:02.942312 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.942274 2570 factory.go:103] Registering Raw factory
Apr 21 06:26:02.942312 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.942288 2570 manager.go:1196] Started watching for new ooms in manager
Apr 21 06:26:02.942694 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.942663 2570 manager.go:319] Starting recovery of all containers
Apr 21 06:26:02.944142 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.944117 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xxzxv"
Apr 21 06:26:02.950061 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.949884 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-68.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 06:26:02.950167 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.949843 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 06:26:02.951543 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.951522 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xxzxv"
Apr 21 06:26:02.955108 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.955093 2570 manager.go:324] Recovery completed
Apr 21 06:26:02.959184 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.959172 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:02.961662 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.961646 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:02.961717 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.961673 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:02.961717 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.961683 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:02.962234 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.962220 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 06:26:02.962234 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.962233 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 06:26:02.962321 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.962250 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 06:26:02.963601 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:02.963539 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-68.ec2.internal.18a84b396a060b21 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-68.ec2.internal,UID:ip-10-0-138-68.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-68.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-68.ec2.internal,},FirstTimestamp:2026-04-21 06:26:02.961660705 +0000 UTC m=+0.468125609,LastTimestamp:2026-04-21 06:26:02.961660705 +0000 UTC m=+0.468125609,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-68.ec2.internal,}"
Apr 21 06:26:02.964503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.964492 2570 policy_none.go:49] "None policy: Start"
Apr 21 06:26:02.964554 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.964508 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 06:26:02.964554 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:02.964518 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 06:26:03.002497 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.002478 2570 manager.go:341] "Starting Device Plugin manager"
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.002526 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.002539 2570 server.go:85] "Starting device plugin registration server"
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.002771 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.002782 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.002903 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
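Each "Server rejected event (will not retry!)" entry embeds the entire serialized Event that was dropped. When many are rejected this way, pulling out just Reason and Message summarizes what was lost; another ad hoc sketch tied to the &Event{...} serialization shown in this log, not a stable format:

    import re

    # Summarize rejected Events by (Reason, Message).
    EVENT = re.compile(r"Reason:([^,]*),Message:([^,]*),")

    def rejected_events(log_text):
        return EVENT.findall(log_text)

    sample = "&Event{...,Reason:Starting,Message:Starting kubelet.,Source:EventSource{...}}"
    assert rejected_events(sample) == [("Starting", "Starting kubelet.")]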
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.002986 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.003536 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 06:26:03.012237 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.003573 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.076453 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.076382 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 06:26:03.077642 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.077625 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 06:26:03.077699 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.077659 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 06:26:03.077699 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.077684 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 06:26:03.077699 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.077693 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 06:26:03.077824 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.077736 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 06:26:03.079989 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.079966 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:03.103929 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.103901 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:03.104947 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.104932 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:03.105060 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.104969 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:03.105060 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.104984 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:03.105060 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.105012 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.111200 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.111185 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.111272 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.111207 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-68.ec2.internal\": node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.125268 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.125247 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.178718 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.178691 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"]
Apr 21 06:26:03.178839 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.178760 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:03.179602 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.179590 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:03.179656 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.179619 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:03.179656 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.179634 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:03.181106 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.181093 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:03.181246 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.181234 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.181283 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.181263 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:03.181741 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.181727 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:03.181820 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.181752 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:03.181820 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.181766 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:03.182166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.182139 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:03.182166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.182164 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:03.182297 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.182176 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:03.183356 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.183343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.183447 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.183366 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:03.184033 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.184018 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:03.184090 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.184046 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:03.184090 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.184058 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:03.209564 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.209546 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-68.ec2.internal\" not found" node="ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.213884 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.213868 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-68.ec2.internal\" not found" node="ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.226092 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.226077 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.326725 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.326629 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.342952 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.342925 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.343019 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.342953 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.343019 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.342973 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9475ce23d467a37e0480df7597bbc574-config\") pod \"kube-apiserver-proxy-ip-10-0-138-68.ec2.internal\" (UID: \"9475ce23d467a37e0480df7597bbc574\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.427339 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.427293 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.443681 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.443655 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.443762 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.443686 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.443762 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.443702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9475ce23d467a37e0480df7597bbc574-config\") pod \"kube-apiserver-proxy-ip-10-0-138-68.ec2.internal\" (UID: \"9475ce23d467a37e0480df7597bbc574\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.443762 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.443729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9475ce23d467a37e0480df7597bbc574-config\") pod \"kube-apiserver-proxy-ip-10-0-138-68.ec2.internal\" (UID: \"9475ce23d467a37e0480df7597bbc574\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.443902 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.443759 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.443902 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.443761 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.511832 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.511793 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.517424 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.517401 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:03.528213 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.528189 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.629413 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.629302 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.729913 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.729847 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.830393 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.830348 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.846802 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.846777 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 06:26:03.846984 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.846965 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 06:26:03.931382 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:03.931359 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 21 06:26:03.940665 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.940617 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 06:26:03.956212 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.956178 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 06:21:02 +0000 UTC" deadline="2027-12-22 08:48:11.537282213 +0000 UTC"
Apr 21 06:26:03.956212 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.956208 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14642h22m7.581078084s"
Apr 21 06:26:03.960759 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.960730 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 06:26:03.975253 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.975234 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:03.979443 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.979427 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wjm92"
Apr 21 06:26:03.986718 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:03.986698 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wjm92"
Apr 21 06:26:04.018488 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:04.018453 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9475ce23d467a37e0480df7597bbc574.slice/crio-93fa3eeef25595b073a49ad7a118234da921116f54a8eed024f443aa55f7dc16 WatchSource:0}: Error finding container 93fa3eeef25595b073a49ad7a118234da921116f54a8eed024f443aa55f7dc16: Status 404 returned error can't find the container with id 93fa3eeef25595b073a49ad7a118234da921116f54a8eed024f443aa55f7dc16
Apr 21 06:26:04.021957 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:04.021930 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25dc2f6e99d2525192843ee005a28c4f.slice/crio-5ea40c0e3d149cbb1539d56208ae8690a7c951b8b84bdb71633022a5ae22b045 WatchSource:0}: Error finding container 5ea40c0e3d149cbb1539d56208ae8690a7c951b8b84bdb71633022a5ae22b045: Status 404 returned error can't find the container with id 5ea40c0e3d149cbb1539d56208ae8690a7c951b8b84bdb71633022a5ae22b045
Apr 21 06:26:04.023237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.023217 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 06:26:04.042036 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.041697 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:04.052737 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.052708 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 06:26:04.054635 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.054620 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 21 06:26:04.059889 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.059877 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 06:26:04.080756 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.080702 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" event={"ID":"25dc2f6e99d2525192843ee005a28c4f","Type":"ContainerStarted","Data":"5ea40c0e3d149cbb1539d56208ae8690a7c951b8b84bdb71633022a5ae22b045"}
Apr 21 06:26:04.081585 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.081563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal" event={"ID":"9475ce23d467a37e0480df7597bbc574","Type":"ContainerStarted","Data":"93fa3eeef25595b073a49ad7a118234da921116f54a8eed024f443aa55f7dc16"}
Apr 21 06:26:04.352397 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.352311 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:04.532089 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.532028 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:04.920054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.920012 2570 apiserver.go:52] "Watching apiserver"
Apr 21 06:26:04.927130 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.927105 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 06:26:04.927538 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.927513 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-78qjr","kube-system/konnectivity-agent-gh2h2","kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz","openshift-cluster-node-tuning-operator/tuned-7fmjt","openshift-dns/node-resolver-8z4fb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal","openshift-multus/multus-additional-cni-plugins-9kgs6","openshift-multus/multus-bngnm","kube-system/global-pull-secret-syncer-frtp9","openshift-image-registry/node-ca-z2xrz","openshift-multus/network-metrics-daemon-xhdsz","openshift-network-diagnostics/network-check-target-pfzbp","openshift-network-operator/iptables-alerter-kwxms"]
Apr 21 06:26:04.929314 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.929288 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9"
Apr 21 06:26:04.929413 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:04.929360 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8"
Apr 21 06:26:04.930440 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.930422 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp"
Apr 21 06:26:04.930511 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:04.930485 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb"
Apr 21 06:26:04.931740 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.931718 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kwxms"
Apr 21 06:26:04.933297 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.933116 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz"
Apr 21 06:26:04.933772 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.933749 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 06:26:04.933885 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.933795 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:26:04.933885 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.933828 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 06:26:04.933885 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.933830 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6rn2f\""
Apr 21 06:26:04.934387 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.934373 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7fmjt"
Apr 21 06:26:04.934984 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.934965 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 06:26:04.935069 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.934969 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j8zxn\""
Apr 21 06:26:04.935281 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.935264 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 06:26:04.935400 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.935381 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 06:26:04.935741 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.935719 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8z4fb"
Apr 21 06:26:04.936444 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.936423 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:26:04.936444 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.936439 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 06:26:04.936643 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.936455 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lb22k\""
Apr 21 06:26:04.937056 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.937040 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bngnm"
Apr 21 06:26:04.937447 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.937428 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 06:26:04.937783 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.937767 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2nkqr\""
Apr 21 06:26:04.937897 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.937798 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 06:26:04.938425 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.938405 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9kgs6"
Apr 21 06:26:04.938914 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.938898 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 06:26:04.939346 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.939330 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 06:26:04.939723 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.939574 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 06:26:04.939723 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.939597 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 06:26:04.939723 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.939603 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-h2gmf\""
Apr 21 06:26:04.939993 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.939822 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gh2h2"
Apr 21 06:26:04.940313 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.940259 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 06:26:04.941418 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.941401 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr"
Apr 21 06:26:04.944576 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.942291 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 06:26:04.944576 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.942570 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9dlfm\""
Apr 21 06:26:04.944576 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.943305 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x4kcr\""
Apr 21 06:26:04.944576 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.943405 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 06:26:04.944576 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.943406 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 06:26:04.944576 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.943953 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 06:26:04.944928 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.944663 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vgk92\""
Apr 21 06:26:04.944928 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.944720 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z2xrz"
Apr 21 06:26:04.945022 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.944986 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 06:26:04.945386 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.945358 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 06:26:04.946937 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.946390 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 06:26:04.946937 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.946642 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 06:26:04.946937 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.946653 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 06:26:04.947175 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.947156 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 06:26:04.947948 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.947933 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 06:26:04.948035 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.948004 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz"
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:04.948096 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:04.948065 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:04.948383 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.948364 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 06:26:04.949395 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.949378 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-284kh\"" Apr 21 06:26:04.950795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.950893 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950803 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74dcd627-03e5-412a-b898-6f771a157832-cni-binary-copy\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.950893 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dec8e610-c059-4e43-8e86-18a73c970319-konnectivity-ca\") pod \"konnectivity-agent-gh2h2\" (UID: \"dec8e610-c059-4e43-8e86-18a73c970319\") " pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:04.950893 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f9a14c21-e359-4c20-95a5-948922cc3ff8-dbus\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:04.950893 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950883 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-device-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:04.951102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-cni-netd\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.951102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfc3125e-919d-4ff6-add5-623ba583cd1a-tmp\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.951102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.950973 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-kubelet\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.951102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-log-socket\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.951102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-kubernetes\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.951102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2lf\" (UniqueName: \"kubernetes.io/projected/bfc3125e-919d-4ff6-add5-623ba583cd1a-kube-api-access-lt2lf\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.951102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzq6l\" (UniqueName: \"kubernetes.io/projected/74dcd627-03e5-412a-b898-6f771a157832-kube-api-access-vzq6l\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ada9d0db-7f80-4159-9e92-7fe71d0647f6-serviceca\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951126 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpn76\" (UniqueName: \"kubernetes.io/projected/ada9d0db-7f80-4159-9e92-7fe71d0647f6-kube-api-access-vpn76\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951150 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-run-netns\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951168 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-tuned\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951191 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-cnibin\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-os-release\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951259 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovnkube-config\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951296 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-env-overrides\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysconfig\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951346 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysctl-d\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-system-cni-dir\") pod \"multus-bngnm\" (UID: 
\"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.951413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-iptables-alerter-script\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfgf\" (UniqueName: \"kubernetes.io/projected/aa39b975-a320-4be6-9871-173b44b3bf1a-kube-api-access-8mfgf\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-modprobe-d\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951528 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-tmp-dir\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951574 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dec8e610-c059-4e43-8e86-18a73c970319-agent-certs\") pod \"konnectivity-agent-gh2h2\" (UID: \"dec8e610-c059-4e43-8e86-18a73c970319\") " pod="kube-system/konnectivity-agent-gh2h2" Apr 
21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rg7\" (UniqueName: \"kubernetes.io/projected/0814d57e-a465-4787-8668-7b52f9ae671d-kube-api-access-69rg7\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951619 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-netns\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951642 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f9a14c21-e359-4c20-95a5-948922cc3ff8-kubelet-config\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-node-log\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951699 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovn-node-metrics-cert\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqk5\" (UniqueName: \"kubernetes.io/projected/4860a8de-8ebf-4c37-b025-9aaf165b999b-kube-api-access-8nqk5\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-lib-modules\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-socket-dir-parent\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.952017 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951825 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-etc-kubernetes\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-host-slash\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951885 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951902 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-registration-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951936 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-etc-selinux\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.951983 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-systemd\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-var-lib-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952033 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-run\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952055 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-sys\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-os-release\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-cni-bin\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952122 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovnkube-script-lib\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952144 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952167 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysctl-conf\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-systemd\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952201 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-host\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.952655 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwnk\" (UniqueName: \"kubernetes.io/projected/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-kube-api-access-6kwnk\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " 
pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-cnibin\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-etc-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952304 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-cni-bin\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952325 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ada9d0db-7f80-4159-9e92-7fe71d0647f6-host\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-system-cni-dir\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952358 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-k8s-cni-cncf-io\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-cni-multus\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952431 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-conf-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952478 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-socket-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952501 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-sys-fs\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952539 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-kubelet\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-run-ovn-kubernetes\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952607 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-var-lib-kubelet\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:04.953294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-multus-certs\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czczv\" (UniqueName: 
\"kubernetes.io/projected/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-kube-api-access-czczv\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952712 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952743 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-systemd-units\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-ovn\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952800 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-hosts-file\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-cni-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952847 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-hostroot\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74dcd627-03e5-412a-b898-6f771a157832-multus-daemon-config\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:04.954020 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.952943 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-slash\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:04.987467 
ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.987439 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:21:03 +0000 UTC" deadline="2027-11-02 01:02:56.654115431 +0000 UTC" Apr 21 06:26:04.987467 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:04.987467 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13434h36m51.666651398s" Apr 21 06:26:05.042347 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.042308 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:05.042513 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.042417 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 06:26:05.053334 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-k8s-cni-cncf-io\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053368 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-cni-multus\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-conf-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-k8s-cni-cncf-io\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053417 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-socket-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053448 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-cni-multus\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.053475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053465 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-sys-fs\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-kubelet\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-run-ovn-kubernetes\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053515 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-conf-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-var-lib-kubelet\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.053552 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-multus-certs\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053579 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-kubelet\") pod \"ovnkube-node-78qjr\" (UID: 
\"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053590 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czczv\" (UniqueName: \"kubernetes.io/projected/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-kube-api-access-czczv\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-socket-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-sys-fs\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.053643 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret podName:f9a14c21-e359-4c20-95a5-948922cc3ff8 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:05.553602186 +0000 UTC m=+3.060067097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret") pod "global-pull-secret-syncer-frtp9" (UID: "f9a14c21-e359-4c20-95a5-948922cc3ff8") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053687 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-var-lib-kubelet\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053704 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-systemd-units\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053735 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-run-ovn-kubernetes\") 
pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-ovn\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.053846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-multus-certs\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053775 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-systemd-units\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053766 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-hosts-file\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053805 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-ovn\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-cni-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-hosts-file\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053823 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-hostroot\") pod \"multus-bngnm\" 
(UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-cni-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053890 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-hostroot\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74dcd627-03e5-412a-b898-6f771a157832-multus-daemon-config\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053932 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-slash\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.053982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74dcd627-03e5-412a-b898-6f771a157832-cni-binary-copy\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054002 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-slash\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054006 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dec8e610-c059-4e43-8e86-18a73c970319-konnectivity-ca\") pod 
\"konnectivity-agent-gh2h2\" (UID: \"dec8e610-c059-4e43-8e86-18a73c970319\") " pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054049 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f9a14c21-e359-4c20-95a5-948922cc3ff8-dbus\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:05.054637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054057 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054080 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054116 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-device-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-cni-netd\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054167 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfc3125e-919d-4ff6-add5-623ba583cd1a-tmp\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054191 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-kubelet\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054193 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-device-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054200 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/f9a14c21-e359-4c20-95a5-948922cc3ff8-dbus\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-log-socket\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054241 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-kubelet\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054260 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-kubernetes\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054280 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-cni-netd\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054284 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2lf\" (UniqueName: \"kubernetes.io/projected/bfc3125e-919d-4ff6-add5-623ba583cd1a-kube-api-access-lt2lf\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054309 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzq6l\" (UniqueName: \"kubernetes.io/projected/74dcd627-03e5-412a-b898-6f771a157832-kube-api-access-vzq6l\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-log-socket\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054330 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ada9d0db-7f80-4159-9e92-7fe71d0647f6-serviceca\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vpn76\" (UniqueName: \"kubernetes.io/projected/ada9d0db-7f80-4159-9e92-7fe71d0647f6-kube-api-access-vpn76\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:05.055524 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054381 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-run-netns\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-tuned\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054427 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-cnibin\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054451 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-os-release\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054460 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74dcd627-03e5-412a-b898-6f771a157832-cni-binary-copy\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054477 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74dcd627-03e5-412a-b898-6f771a157832-multus-daemon-config\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054503 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dec8e610-c059-4e43-8e86-18a73c970319-konnectivity-ca\") pod \"konnectivity-agent-gh2h2\" (UID: \"dec8e610-c059-4e43-8e86-18a73c970319\") " pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054508 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-run-netns\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovnkube-config\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054557 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-kubernetes\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-os-release\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-cnibin\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054740 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-env-overrides\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054766 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysconfig\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054789 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysctl-d\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-system-cni-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysconfig\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.056316 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-system-cni-dir\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054921 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-iptables-alerter-script\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054949 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfgf\" (UniqueName: \"kubernetes.io/projected/aa39b975-a320-4be6-9871-173b44b3bf1a-kube-api-access-8mfgf\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054958 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysctl-d\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ada9d0db-7f80-4159-9e92-7fe71d0647f6-serviceca\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.054973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-modprobe-d\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055017 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknws\" (UniqueName: \"kubernetes.io/projected/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-kube-api-access-cknws\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055046 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055063 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-modprobe-d\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055129 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-tmp-dir\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dec8e610-c059-4e43-8e86-18a73c970319-agent-certs\") pod \"konnectivity-agent-gh2h2\" (UID: \"dec8e610-c059-4e43-8e86-18a73c970319\") " pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69rg7\" (UniqueName: \"kubernetes.io/projected/0814d57e-a465-4787-8668-7b52f9ae671d-kube-api-access-69rg7\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055204 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-env-overrides\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055262 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-netns\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 
06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057109 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-iptables-alerter-script\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-run-netns\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f9a14c21-e359-4c20-95a5-948922cc3ff8-kubelet-config\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-node-log\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovn-node-metrics-cert\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055592 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-tmp-dir\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqk5\" (UniqueName: \"kubernetes.io/projected/4860a8de-8ebf-4c37-b025-9aaf165b999b-kube-api-access-8nqk5\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-lib-modules\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055680 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-socket-dir-parent\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055688 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovnkube-config\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055695 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055706 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-etc-kubernetes\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa39b975-a320-4be6-9871-173b44b3bf1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-etc-kubernetes\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-host-slash\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-node-log\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055779 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.057812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055784 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-host-slash\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-lib-modules\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-registration-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-etc-selinux\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055837 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f9a14c21-e359-4c20-95a5-948922cc3ff8-kubelet-config\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055849 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-registration-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055872 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-systemd\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055906 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-multus-socket-dir-parent\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055909 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-run-systemd\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055912 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-var-lib-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055943 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-var-lib-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-run\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0814d57e-a465-4787-8668-7b52f9ae671d-etc-selinux\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-sys\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.055987 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-run\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056002 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-os-release\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.058503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056025 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-cni-bin\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056031 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-sys\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056049 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovnkube-script-lib\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-os-release\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056088 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-host-cni-bin\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysctl-conf\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056135 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-systemd\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056158 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-host\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056183 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6kwnk\" (UniqueName: \"kubernetes.io/projected/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-kube-api-access-6kwnk\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056201 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-sysctl-conf\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056209 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-cnibin\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056241 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-etc-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-cni-bin\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ada9d0db-7f80-4159-9e92-7fe71d0647f6-host\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-system-cni-dir\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4860a8de-8ebf-4c37-b025-9aaf165b999b-etc-openvswitch\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056393 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-host-var-lib-cni-bin\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.059238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056396 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa39b975-a320-4be6-9871-173b44b3bf1a-system-cni-dir\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-systemd\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056442 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfc3125e-919d-4ff6-add5-623ba583cd1a-host\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056452 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74dcd627-03e5-412a-b898-6f771a157832-cnibin\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056480 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ada9d0db-7f80-4159-9e92-7fe71d0647f6-host\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.056554 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovnkube-script-lib\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.057944 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bfc3125e-919d-4ff6-add5-623ba583cd1a-etc-tuned\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.058191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4860a8de-8ebf-4c37-b025-9aaf165b999b-ovn-node-metrics-cert\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.058209 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dec8e610-c059-4e43-8e86-18a73c970319-agent-certs\") pod \"konnectivity-agent-gh2h2\" (UID: \"dec8e610-c059-4e43-8e86-18a73c970319\") " pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:05.059973 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.058895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/bfc3125e-919d-4ff6-add5-623ba583cd1a-tmp\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.066871 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.065721 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:05.066871 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.065745 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:05.066871 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.065758 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fsfgk for pod openshift-network-diagnostics/network-check-target-pfzbp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:05.066871 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.065819 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk podName:aba5693c-c88c-45ce-9751-0d5e014097eb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:05.565802359 +0000 UTC m=+3.072267251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fsfgk" (UniqueName: "kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk") pod "network-check-target-pfzbp" (UID: "aba5693c-c88c-45ce-9751-0d5e014097eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:05.067145 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.066990 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czczv\" (UniqueName: \"kubernetes.io/projected/a1478fba-f9dc-413e-8354-ffdd0bcdaed2-kube-api-access-czczv\") pod \"iptables-alerter-kwxms\" (UID: \"a1478fba-f9dc-413e-8354-ffdd0bcdaed2\") " pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:05.067767 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.067594 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2lf\" (UniqueName: \"kubernetes.io/projected/bfc3125e-919d-4ff6-add5-623ba583cd1a-kube-api-access-lt2lf\") pod \"tuned-7fmjt\" (UID: \"bfc3125e-919d-4ff6-add5-623ba583cd1a\") " pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.068237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.068214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpn76\" (UniqueName: \"kubernetes.io/projected/ada9d0db-7f80-4159-9e92-7fe71d0647f6-kube-api-access-vpn76\") pod \"node-ca-z2xrz\" (UID: \"ada9d0db-7f80-4159-9e92-7fe71d0647f6\") " pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:05.068846 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.068794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqk5\" (UniqueName: \"kubernetes.io/projected/4860a8de-8ebf-4c37-b025-9aaf165b999b-kube-api-access-8nqk5\") pod \"ovnkube-node-78qjr\" (UID: \"4860a8de-8ebf-4c37-b025-9aaf165b999b\") " pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 
06:26:05.068966 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.068932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwnk\" (UniqueName: \"kubernetes.io/projected/ea78721d-4fb0-4884-9dfd-d0be9bbc750b-kube-api-access-6kwnk\") pod \"node-resolver-8z4fb\" (UID: \"ea78721d-4fb0-4884-9dfd-d0be9bbc750b\") " pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:05.069551 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.069511 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzq6l\" (UniqueName: \"kubernetes.io/projected/74dcd627-03e5-412a-b898-6f771a157832-kube-api-access-vzq6l\") pod \"multus-bngnm\" (UID: \"74dcd627-03e5-412a-b898-6f771a157832\") " pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.069656 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.069622 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rg7\" (UniqueName: \"kubernetes.io/projected/0814d57e-a465-4787-8668-7b52f9ae671d-kube-api-access-69rg7\") pod \"aws-ebs-csi-driver-node-w9lpz\" (UID: \"0814d57e-a465-4787-8668-7b52f9ae671d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.069954 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.069934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfgf\" (UniqueName: \"kubernetes.io/projected/aa39b975-a320-4be6-9871-173b44b3bf1a-kube-api-access-8mfgf\") pod \"multus-additional-cni-plugins-9kgs6\" (UID: \"aa39b975-a320-4be6-9871-173b44b3bf1a\") " pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.157438 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.157404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:05.157621 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.157460 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cknws\" (UniqueName: \"kubernetes.io/projected/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-kube-api-access-cknws\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:05.157621 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.157560 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:05.157706 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.157627 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:26:05.657608213 +0000 UTC m=+3.164073112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs") pod "network-metrics-daemon-xhdsz" (UID: "d0022157-8720-4a4c-8cf0-324fe8cb0e3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:05.167841 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.167807 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknws\" (UniqueName: \"kubernetes.io/projected/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-kube-api-access-cknws\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:05.245812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.245730 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kwxms" Apr 21 06:26:05.253610 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.253581 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" Apr 21 06:26:05.262422 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.262398 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" Apr 21 06:26:05.268078 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.268059 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8z4fb" Apr 21 06:26:05.275703 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.275687 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bngnm" Apr 21 06:26:05.281216 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.281196 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" Apr 21 06:26:05.287736 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.287715 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:05.294340 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.294318 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:05.298940 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.298918 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z2xrz" Apr 21 06:26:05.560534 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.560459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:05.560692 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.560601 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:05.560692 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.560655 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret podName:f9a14c21-e359-4c20-95a5-948922cc3ff8 nodeName:}" failed. 
No retries permitted until 2026-04-21 06:26:06.560637513 +0000 UTC m=+4.067102411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret") pod "global-pull-secret-syncer-frtp9" (UID: "f9a14c21-e359-4c20-95a5-948922cc3ff8") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:05.615974 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:05.612844 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74dcd627_03e5_412a_b898_6f771a157832.slice/crio-2819a326c0122a4a634daa4959aa23778676997784eb3495aa46a80362ce2fc6 WatchSource:0}: Error finding container 2819a326c0122a4a634daa4959aa23778676997784eb3495aa46a80362ce2fc6: Status 404 returned error can't find the container with id 2819a326c0122a4a634daa4959aa23778676997784eb3495aa46a80362ce2fc6 Apr 21 06:26:05.618332 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:05.618293 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4860a8de_8ebf_4c37_b025_9aaf165b999b.slice/crio-f1e38fbe6ee62cf76656e0e41f8cf34eb70265e125fa1b1c5b54783fc8da9880 WatchSource:0}: Error finding container f1e38fbe6ee62cf76656e0e41f8cf34eb70265e125fa1b1c5b54783fc8da9880: Status 404 returned error can't find the container with id f1e38fbe6ee62cf76656e0e41f8cf34eb70265e125fa1b1c5b54783fc8da9880 Apr 21 06:26:05.619627 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:05.619559 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada9d0db_7f80_4159_9e92_7fe71d0647f6.slice/crio-94f4eeb0073635d4db299da38bcdda66abb221b2af9d4afa2dcfcab94fc36d9b WatchSource:0}: Error finding container 94f4eeb0073635d4db299da38bcdda66abb221b2af9d4afa2dcfcab94fc36d9b: Status 404 returned error can't find the container with id 94f4eeb0073635d4db299da38bcdda66abb221b2af9d4afa2dcfcab94fc36d9b Apr 21 06:26:05.621088 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:05.621066 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc3125e_919d_4ff6_add5_623ba583cd1a.slice/crio-8b394a131cf99d2b31023be2ba4f38b6b12b22149c1830a516bbf3f736086ac5 WatchSource:0}: Error finding container 8b394a131cf99d2b31023be2ba4f38b6b12b22149c1830a516bbf3f736086ac5: Status 404 returned error can't find the container with id 8b394a131cf99d2b31023be2ba4f38b6b12b22149c1830a516bbf3f736086ac5 Apr 21 06:26:05.621635 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:05.621616 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea78721d_4fb0_4884_9dfd_d0be9bbc750b.slice/crio-854aeac667f973aa174342e9f14d9cd4fe10fb7495a97b9124b1dfc6646141e7 WatchSource:0}: Error finding container 854aeac667f973aa174342e9f14d9cd4fe10fb7495a97b9124b1dfc6646141e7: Status 404 returned error can't find the container with id 854aeac667f973aa174342e9f14d9cd4fe10fb7495a97b9124b1dfc6646141e7 Apr 21 06:26:05.622680 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:05.622611 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0814d57e_a465_4787_8668_7b52f9ae671d.slice/crio-b6c2e6e765ae45dd0c8e35ea379fe0f74a97bc717a829be64e4f799150ed8c01 WatchSource:0}: Error 
finding container b6c2e6e765ae45dd0c8e35ea379fe0f74a97bc717a829be64e4f799150ed8c01: Status 404 returned error can't find the container with id b6c2e6e765ae45dd0c8e35ea379fe0f74a97bc717a829be64e4f799150ed8c01 Apr 21 06:26:05.626239 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:05.625611 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec8e610_c059_4e43_8e86_18a73c970319.slice/crio-d08bba2e0d870efa8e72f27083ebc6c371bc98aa5a23d34bb0f727d80b727106 WatchSource:0}: Error finding container d08bba2e0d870efa8e72f27083ebc6c371bc98aa5a23d34bb0f727d80b727106: Status 404 returned error can't find the container with id d08bba2e0d870efa8e72f27083ebc6c371bc98aa5a23d34bb0f727d80b727106 Apr 21 06:26:05.661192 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.661043 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:05.661275 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.661230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:05.661275 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.661151 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:05.661346 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.661309 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:26:06.661295335 +0000 UTC m=+4.167760226 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs") pod "network-metrics-daemon-xhdsz" (UID: "d0022157-8720-4a4c-8cf0-324fe8cb0e3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:05.661346 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.661318 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:05.661346 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.661331 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:05.661346 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.661339 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fsfgk for pod openshift-network-diagnostics/network-check-target-pfzbp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:05.661479 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:05.661374 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk podName:aba5693c-c88c-45ce-9751-0d5e014097eb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:06.661360213 +0000 UTC m=+4.167825103 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsfgk" (UniqueName: "kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk") pod "network-check-target-pfzbp" (UID: "aba5693c-c88c-45ce-9751-0d5e014097eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:05.990233 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.988519 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:21:03 +0000 UTC" deadline="2027-10-25 13:02:45.86417642 +0000 UTC" Apr 21 06:26:05.990233 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:05.988555 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13254h36m39.875625171s" Apr 21 06:26:06.078916 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.078886 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:06.079080 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.079017 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:06.098142 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.098102 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z2xrz" event={"ID":"ada9d0db-7f80-4159-9e92-7fe71d0647f6","Type":"ContainerStarted","Data":"94f4eeb0073635d4db299da38bcdda66abb221b2af9d4afa2dcfcab94fc36d9b"} Apr 21 06:26:06.107556 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.107509 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" event={"ID":"bfc3125e-919d-4ff6-add5-623ba583cd1a","Type":"ContainerStarted","Data":"8b394a131cf99d2b31023be2ba4f38b6b12b22149c1830a516bbf3f736086ac5"} Apr 21 06:26:06.114427 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.114391 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bngnm" event={"ID":"74dcd627-03e5-412a-b898-6f771a157832","Type":"ContainerStarted","Data":"2819a326c0122a4a634daa4959aa23778676997784eb3495aa46a80362ce2fc6"} Apr 21 06:26:06.117079 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.117052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kwxms" event={"ID":"a1478fba-f9dc-413e-8354-ffdd0bcdaed2","Type":"ContainerStarted","Data":"e3014e6e367d971d2018dfe9e4236688547fcaecbca988a04d1acdc1d1941d61"} Apr 21 06:26:06.126079 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.126044 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8z4fb" event={"ID":"ea78721d-4fb0-4884-9dfd-d0be9bbc750b","Type":"ContainerStarted","Data":"854aeac667f973aa174342e9f14d9cd4fe10fb7495a97b9124b1dfc6646141e7"} Apr 21 06:26:06.139783 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.139755 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerStarted","Data":"a1763700c3ab24410b01db8deba429f18e4f46db6166385ed42a1536f807fde6"} Apr 21 06:26:06.147378 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.147351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal" event={"ID":"9475ce23d467a37e0480df7597bbc574","Type":"ContainerStarted","Data":"9d889887c81c88b41617426e15f9168e6f918b7df481816f1cdc4f494fe47ccb"} Apr 21 06:26:06.155407 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.155377 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gh2h2" event={"ID":"dec8e610-c059-4e43-8e86-18a73c970319","Type":"ContainerStarted","Data":"d08bba2e0d870efa8e72f27083ebc6c371bc98aa5a23d34bb0f727d80b727106"} Apr 21 06:26:06.160060 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.159788 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"f1e38fbe6ee62cf76656e0e41f8cf34eb70265e125fa1b1c5b54783fc8da9880"} Apr 21 06:26:06.167832 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.167787 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" event={"ID":"0814d57e-a465-4787-8668-7b52f9ae671d","Type":"ContainerStarted","Data":"b6c2e6e765ae45dd0c8e35ea379fe0f74a97bc717a829be64e4f799150ed8c01"} Apr 21 06:26:06.569009 ip-10-0-138-68 kubenswrapper[2570]: 
I0421 06:26:06.568978 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:06.569606 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.569171 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:06.569606 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.569241 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret podName:f9a14c21-e359-4c20-95a5-948922cc3ff8 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:08.569223603 +0000 UTC m=+6.075688498 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret") pod "global-pull-secret-syncer-frtp9" (UID: "f9a14c21-e359-4c20-95a5-948922cc3ff8") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:06.669479 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.669449 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:06.669610 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:06.669525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:06.669668 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.669641 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:06.669726 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.669696 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:26:08.669676197 +0000 UTC m=+6.176141093 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs") pod "network-metrics-daemon-xhdsz" (UID: "d0022157-8720-4a4c-8cf0-324fe8cb0e3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:06.670086 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.670070 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:06.670149 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.670092 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:06.670149 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.670105 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fsfgk for pod openshift-network-diagnostics/network-check-target-pfzbp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:06.670149 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:06.670147 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk podName:aba5693c-c88c-45ce-9751-0d5e014097eb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:08.670131976 +0000 UTC m=+6.176596869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsfgk" (UniqueName: "kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk") pod "network-check-target-pfzbp" (UID: "aba5693c-c88c-45ce-9751-0d5e014097eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:07.079045 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:07.079014 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:07.079498 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:07.079020 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:07.079498 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:07.079153 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:07.079498 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:07.079180 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:07.189919 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:07.189881 2570 generic.go:358] "Generic (PLEG): container finished" podID="25dc2f6e99d2525192843ee005a28c4f" containerID="4d3d383ee2171fa40db4da8b73abe6bdedee32bd3daa41f09b38d7cf7bc240b5" exitCode=0 Apr 21 06:26:07.190102 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:07.189950 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" event={"ID":"25dc2f6e99d2525192843ee005a28c4f","Type":"ContainerDied","Data":"4d3d383ee2171fa40db4da8b73abe6bdedee32bd3daa41f09b38d7cf7bc240b5"} Apr 21 06:26:07.212563 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:07.212510 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal" podStartSLOduration=3.212492426 podStartE2EDuration="3.212492426s" podCreationTimestamp="2026-04-21 06:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:26:06.164538302 +0000 UTC m=+3.671003216" watchObservedRunningTime="2026-04-21 06:26:07.212492426 +0000 UTC m=+4.718957343" Apr 21 06:26:08.077968 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:08.077927 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:08.078151 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.078057 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:08.196110 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:08.196075 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" event={"ID":"25dc2f6e99d2525192843ee005a28c4f","Type":"ContainerStarted","Data":"280e7e1dc02509a1ed63973c7cee7a99bfca577c10991b72296e660ea6218122"} Apr 21 06:26:08.586961 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:08.586927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:08.587159 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.587086 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:08.587228 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.587159 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret podName:f9a14c21-e359-4c20-95a5-948922cc3ff8 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:12.587139765 +0000 UTC m=+10.093604658 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret") pod "global-pull-secret-syncer-frtp9" (UID: "f9a14c21-e359-4c20-95a5-948922cc3ff8") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:08.688049 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:08.688005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:08.688193 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:08.688091 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:08.688267 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.688249 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:08.688318 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.688269 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:08.688318 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.688282 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fsfgk for pod openshift-network-diagnostics/network-check-target-pfzbp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:08.688420 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.688341 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk podName:aba5693c-c88c-45ce-9751-0d5e014097eb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:12.688321649 +0000 UTC m=+10.194786554 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsfgk" (UniqueName: "kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk") pod "network-check-target-pfzbp" (UID: "aba5693c-c88c-45ce-9751-0d5e014097eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:08.688685 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.688600 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:08.688685 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:08.688664 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:26:12.688647137 +0000 UTC m=+10.195112047 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs") pod "network-metrics-daemon-xhdsz" (UID: "d0022157-8720-4a4c-8cf0-324fe8cb0e3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:09.082358 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:09.081844 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:09.082358 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:09.081979 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:09.082358 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:09.082007 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:09.082358 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:09.082115 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:10.078551 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:10.078519 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:10.079028 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:10.078657 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:11.079521 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:11.078905 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:11.079521 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:11.079075 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:11.079521 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:11.079175 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:11.079521 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:11.079455 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:12.078313 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:12.078276 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:12.078483 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.078415 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:12.623637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:12.623603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:12.624096 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.623733 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:12.624096 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.623796 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret podName:f9a14c21-e359-4c20-95a5-948922cc3ff8 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:20.623778283 +0000 UTC m=+18.130243177 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret") pod "global-pull-secret-syncer-frtp9" (UID: "f9a14c21-e359-4c20-95a5-948922cc3ff8") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:12.724258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:12.724198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:12.724403 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:12.724272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:12.724476 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.724406 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:12.724476 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.724468 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:26:20.724446824 +0000 UTC m=+18.230911729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs") pod "network-metrics-daemon-xhdsz" (UID: "d0022157-8720-4a4c-8cf0-324fe8cb0e3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:12.724811 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.724709 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:12.724811 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.724739 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:12.724811 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.724752 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fsfgk for pod openshift-network-diagnostics/network-check-target-pfzbp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:12.724811 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:12.724798 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk podName:aba5693c-c88c-45ce-9751-0d5e014097eb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:20.724783579 +0000 UTC m=+18.231248486 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fsfgk" (UniqueName: "kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk") pod "network-check-target-pfzbp" (UID: "aba5693c-c88c-45ce-9751-0d5e014097eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:13.082294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:13.081797 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:13.082294 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:13.081927 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:13.082294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:13.082014 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:13.082294 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:13.082105 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:14.078373 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:14.078321 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:14.078774 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:14.078500 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:15.080797 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:15.080768 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:15.081248 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:15.080768 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:15.081248 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:15.080897 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:15.081248 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:15.080953 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:16.078421 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:16.078386 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:16.078585 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:16.078483 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:17.077967 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:17.077928 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:17.077967 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:17.077963 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:17.078494 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:17.078080 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:17.078494 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:17.078214 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:18.078638 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:18.078559 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:18.079066 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:18.078681 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:19.081331 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:19.081302 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:19.081767 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:19.081302 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:19.081767 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:19.081434 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:19.081767 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:19.081508 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:20.078902 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:20.078845 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:20.079079 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.078999 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:20.688424 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:20.688388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:20.688806 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.688560 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:20.688806 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.688635 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret podName:f9a14c21-e359-4c20-95a5-948922cc3ff8 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:36.688615556 +0000 UTC m=+34.195080455 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret") pod "global-pull-secret-syncer-frtp9" (UID: "f9a14c21-e359-4c20-95a5-948922cc3ff8") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:20.789259 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:20.789227 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:20.789446 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:20.789289 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:20.789446 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.789408 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:20.789446 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.789423 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:20.789446 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.789437 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:20.789446 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.789446 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fsfgk for pod openshift-network-diagnostics/network-check-target-pfzbp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:20.789704 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.789482 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:26:36.789462265 +0000 UTC m=+34.295927158 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs") pod "network-metrics-daemon-xhdsz" (UID: "d0022157-8720-4a4c-8cf0-324fe8cb0e3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:20.789704 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:20.789497 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk podName:aba5693c-c88c-45ce-9751-0d5e014097eb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:36.789490904 +0000 UTC m=+34.295955794 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fsfgk" (UniqueName: "kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk") pod "network-check-target-pfzbp" (UID: "aba5693c-c88c-45ce-9751-0d5e014097eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:21.078819 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:21.078732 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:21.078819 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:21.078760 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:21.079047 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:21.078894 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:21.079047 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:21.079026 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:22.078227 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:22.078199 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:22.078550 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:22.078290 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:23.079705 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.079397 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:23.080305 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.079482 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:23.080305 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:23.079795 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:23.080305 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:23.079916 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:23.222103 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.222073 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gh2h2" event={"ID":"dec8e610-c059-4e43-8e86-18a73c970319","Type":"ContainerStarted","Data":"7ac8501a56157045c60abc6a05c8168c6e5655c160a95ce889cab330ccfd0230"} Apr 21 06:26:23.224518 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.224495 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:26:23.224968 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.224938 2570 generic.go:358] "Generic (PLEG): container finished" podID="4860a8de-8ebf-4c37-b025-9aaf165b999b" containerID="eb33843846e7557584a71296d04038cf29b1d08d6f885cd4e8c12bad4450f324" exitCode=1 Apr 21 06:26:23.225083 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.224996 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"7ecdccac9a387bb30c9dc5be86c89354678abb10951c4fc559f2da2ea68ba830"} Apr 21 06:26:23.225083 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.225026 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"261471f9cde9857c07e59c2de0da0df82be0dd4a2667c9af7c7e51d27ab58235"} Apr 21 06:26:23.225083 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.225040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"017a2a3952f000c4cbbb8df45d2bf3d5cb37e869a5a72bf84f0680486e5638de"} Apr 21 06:26:23.225083 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.225051 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerDied","Data":"eb33843846e7557584a71296d04038cf29b1d08d6f885cd4e8c12bad4450f324"} Apr 21 06:26:23.225083 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.225065 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"cdcf31f2a26f538bbae3f7bc8f146c1c4d9ad8cb36ee7638db40dabe3924fc08"} Apr 21 06:26:23.226757 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.226733 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" event={"ID":"0814d57e-a465-4787-8668-7b52f9ae671d","Type":"ContainerStarted","Data":"92adddcabf0249c285c6eddec0817167642ba31a7f7c59b4f17191a239edc6da"} Apr 21 06:26:23.228595 ip-10-0-138-68 kubenswrapper[2570]: I0421 
06:26:23.228387 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z2xrz" event={"ID":"ada9d0db-7f80-4159-9e92-7fe71d0647f6","Type":"ContainerStarted","Data":"599b4d2415d5900d0fa7548e3371d9819544e625014bf31cca522e39627d0f54"} Apr 21 06:26:23.233044 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.233010 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" event={"ID":"bfc3125e-919d-4ff6-add5-623ba583cd1a","Type":"ContainerStarted","Data":"aea874f12e5421ccd7011a5d7235f7b60569a967d0605218439efa4d5bf4b741"} Apr 21 06:26:23.234313 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.234292 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bngnm" event={"ID":"74dcd627-03e5-412a-b898-6f771a157832","Type":"ContainerStarted","Data":"a2a6feced9d4cc61f4cd219b84c664958b35827aa91a6103aa6f29f8fc537ea2"} Apr 21 06:26:23.234585 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.234542 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" podStartSLOduration=19.23450951 podStartE2EDuration="19.23450951s" podCreationTimestamp="2026-04-21 06:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:26:08.210425508 +0000 UTC m=+5.716890421" watchObservedRunningTime="2026-04-21 06:26:23.23450951 +0000 UTC m=+20.740974420" Apr 21 06:26:23.235391 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.235346 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gh2h2" podStartSLOduration=3.480548222 podStartE2EDuration="20.235337836s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.628759878 +0000 UTC m=+3.135224784" lastFinishedPulling="2026-04-21 06:26:22.383549502 +0000 UTC m=+19.890014398" observedRunningTime="2026-04-21 06:26:23.23474863 +0000 UTC m=+20.741213535" watchObservedRunningTime="2026-04-21 06:26:23.235337836 +0000 UTC m=+20.741802749" Apr 21 06:26:23.235794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.235772 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8z4fb" event={"ID":"ea78721d-4fb0-4884-9dfd-d0be9bbc750b","Type":"ContainerStarted","Data":"febef30016327aaef49b878c7feb8a94ea8a5fa0630d091a808249661f9d8161"} Apr 21 06:26:23.237211 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.237191 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa39b975-a320-4be6-9871-173b44b3bf1a" containerID="a8fee49d5297270093c2e75e9baced38b2273573454304d3d87a37eeca347bfd" exitCode=0 Apr 21 06:26:23.237294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.237219 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerDied","Data":"a8fee49d5297270093c2e75e9baced38b2273573454304d3d87a37eeca347bfd"} Apr 21 06:26:23.247888 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.247837 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z2xrz" podStartSLOduration=8.079924327 podStartE2EDuration="20.247823552s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.621737106 +0000 UTC m=+3.128202000" 
lastFinishedPulling="2026-04-21 06:26:17.789636328 +0000 UTC m=+15.296101225" observedRunningTime="2026-04-21 06:26:23.24736471 +0000 UTC m=+20.753829623" watchObservedRunningTime="2026-04-21 06:26:23.247823552 +0000 UTC m=+20.754288468" Apr 21 06:26:23.262981 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.262939 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7fmjt" podStartSLOduration=3.503998111 podStartE2EDuration="20.262925438s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.624492459 +0000 UTC m=+3.130957365" lastFinishedPulling="2026-04-21 06:26:22.383419784 +0000 UTC m=+19.889884692" observedRunningTime="2026-04-21 06:26:23.262738863 +0000 UTC m=+20.769203776" watchObservedRunningTime="2026-04-21 06:26:23.262925438 +0000 UTC m=+20.769390352" Apr 21 06:26:23.279538 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.279493 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bngnm" podStartSLOduration=3.4700489660000002 podStartE2EDuration="20.279479022s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.614311621 +0000 UTC m=+3.120776525" lastFinishedPulling="2026-04-21 06:26:22.423741683 +0000 UTC m=+19.930206581" observedRunningTime="2026-04-21 06:26:23.278996649 +0000 UTC m=+20.785461561" watchObservedRunningTime="2026-04-21 06:26:23.279479022 +0000 UTC m=+20.785943935" Apr 21 06:26:23.299141 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.299099 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8z4fb" podStartSLOduration=3.541414228 podStartE2EDuration="20.29908641s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.625757789 +0000 UTC m=+3.132222690" lastFinishedPulling="2026-04-21 06:26:22.383429977 +0000 UTC m=+19.889894872" observedRunningTime="2026-04-21 06:26:23.298976313 +0000 UTC m=+20.805441227" watchObservedRunningTime="2026-04-21 06:26:23.29908641 +0000 UTC m=+20.805551322" Apr 21 06:26:23.881329 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:23.881308 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 06:26:24.014168 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.013992 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T06:26:23.881324611Z","UUID":"a3abcac4-f503-47d5-b042-26d00a88d585","Handler":null,"Name":"","Endpoint":""} Apr 21 06:26:24.016332 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.016211 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 06:26:24.016332 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.016244 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 06:26:24.078819 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.078788 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:24.078966 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:24.078904 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:24.241141 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.241108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kwxms" event={"ID":"a1478fba-f9dc-413e-8354-ffdd0bcdaed2","Type":"ContainerStarted","Data":"fc2f6cdb053afc7bc40d0bb551ec219bd8c86342bdc45d5fecd802b7f61b153c"} Apr 21 06:26:24.244025 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.244003 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:26:24.244474 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.244442 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"cacfdcec6d60390c9bdd1f2a8a2461eddde9d23dfa0f30d407482b320fffd058"} Apr 21 06:26:24.246185 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.246159 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" event={"ID":"0814d57e-a465-4787-8668-7b52f9ae671d","Type":"ContainerStarted","Data":"e632a198a5ba51f9f2a61e08b05610d0399cb66eef109303a17c7116a0f91037"} Apr 21 06:26:24.254714 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.254667 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kwxms" podStartSLOduration=4.49910143 podStartE2EDuration="21.254651452s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.627909431 +0000 UTC m=+3.134374337" lastFinishedPulling="2026-04-21 06:26:22.383459467 +0000 UTC m=+19.889924359" observedRunningTime="2026-04-21 06:26:24.253822859 +0000 UTC m=+21.760287771" watchObservedRunningTime="2026-04-21 06:26:24.254651452 +0000 UTC m=+21.761116365" Apr 21 06:26:24.980668 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.980606 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:24.981283 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:24.981263 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:25.078444 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:25.078353 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:25.078591 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:25.078360 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:25.078591 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:25.078564 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:25.078591 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:25.078483 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:25.250318 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:25.250280 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" event={"ID":"0814d57e-a465-4787-8668-7b52f9ae671d","Type":"ContainerStarted","Data":"2627a22a4b72cfaabc9dd2f734a5f7e01dc2918676257c5db8fae033a2452fdc"} Apr 21 06:26:25.265402 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:25.265350 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-w9lpz" podStartSLOduration=3.083798286 podStartE2EDuration="22.265333105s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.6276244 +0000 UTC m=+3.134089304" lastFinishedPulling="2026-04-21 06:26:24.809159216 +0000 UTC m=+22.315624123" observedRunningTime="2026-04-21 06:26:25.264785323 +0000 UTC m=+22.771250239" watchObservedRunningTime="2026-04-21 06:26:25.265333105 +0000 UTC m=+22.771798018" Apr 21 06:26:26.078350 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:26.078314 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:26.078534 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:26.078424 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:26.254967 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:26.254758 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:26:26.255371 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:26.255338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"83557e182a57e4eea71813176fcca0821d869bbdb9ceef0dc56f63f1d606464b"} Apr 21 06:26:26.255371 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:26.255356 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 06:26:27.078006 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:27.077973 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:27.078169 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:27.078020 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:27.078169 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:27.078109 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:27.078290 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:27.078220 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:28.078111 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.078079 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:28.078755 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:28.078174 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:28.142403 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.142194 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:28.142550 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.142515 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 06:26:28.142765 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.142748 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gh2h2" Apr 21 06:26:28.260293 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.260252 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa39b975-a320-4be6-9871-173b44b3bf1a" containerID="b8cc0a4398f7e9650423baebe3554547646265de3804f8565c60494a6baf99c9" exitCode=0 Apr 21 06:26:28.260293 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.260285 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerDied","Data":"b8cc0a4398f7e9650423baebe3554547646265de3804f8565c60494a6baf99c9"} Apr 21 06:26:28.263867 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.263728 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:26:28.264107 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.264087 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"11dcfe33351f595a3bdd031e88ba84021488fc7d32330dfa3b20dec2bb761604"} Apr 21 06:26:28.264401 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.264377 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:28.264506 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.264442 2570 scope.go:117] "RemoveContainer" containerID="eb33843846e7557584a71296d04038cf29b1d08d6f885cd4e8c12bad4450f324" Apr 21 06:26:28.278620 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:28.278603 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:29.078587 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.078555 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:29.078939 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.078590 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:29.078939 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:29.078701 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:29.078939 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:29.078819 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:29.271699 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.271635 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:26:29.271992 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.271967 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" event={"ID":"4860a8de-8ebf-4c37-b025-9aaf165b999b","Type":"ContainerStarted","Data":"25260eda7a9420e98b90fc8a84a2dd51dbf551e79360ed05030707e67601ec99"} Apr 21 06:26:29.272087 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.272074 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 06:26:29.272307 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.272283 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:29.273812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.273791 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa39b975-a320-4be6-9871-173b44b3bf1a" containerID="2adf906bc92b13670d95c886676b45c4c1df3a985c1d0fedce51de8e7315705a" exitCode=0 Apr 21 06:26:29.273913 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.273818 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerDied","Data":"2adf906bc92b13670d95c886676b45c4c1df3a985c1d0fedce51de8e7315705a"} Apr 21 06:26:29.286177 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.286157 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:29.296935 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.296887 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" podStartSLOduration=9.447207025 podStartE2EDuration="26.296873588s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.620161343 +0000 UTC m=+3.126626241" lastFinishedPulling="2026-04-21 06:26:22.469827899 +0000 UTC m=+19.976292804" observedRunningTime="2026-04-21 06:26:29.295463206 +0000 UTC m=+26.801928121" watchObservedRunningTime="2026-04-21 06:26:29.296873588 +0000 UTC m=+26.803338502" Apr 21 06:26:29.453833 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.453798 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pfzbp"] Apr 21 06:26:29.453980 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.453944 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:29.454070 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:29.454046 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:29.456901 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.456873 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-frtp9"] Apr 21 06:26:29.457024 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.456982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:29.457143 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:29.457123 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:29.457477 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.457454 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xhdsz"] Apr 21 06:26:29.457567 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:29.457556 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:29.457666 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:29.457646 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:30.277717 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:30.277635 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa39b975-a320-4be6-9871-173b44b3bf1a" containerID="40be59fa1f65f6a762c2f37af537eb87b70b69450dfb2f54b0413be11801c076" exitCode=0 Apr 21 06:26:30.277717 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:30.277676 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerDied","Data":"40be59fa1f65f6a762c2f37af537eb87b70b69450dfb2f54b0413be11801c076"} Apr 21 06:26:30.278209 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:30.277980 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 06:26:31.078960 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:31.078927 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:31.079133 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:31.078936 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:31.079133 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:31.079053 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:31.079133 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:31.078944 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:31.079283 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:31.079129 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:31.079283 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:31.079207 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:31.279948 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:31.279919 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 06:26:33.079685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:33.079655 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:33.080238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:33.079754 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:33.080238 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:33.079766 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:33.080238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:33.079797 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:33.080238 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:33.079927 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:33.080238 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:33.080020 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:33.536160 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:33.536129 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:26:33.536356 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:33.536341 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 06:26:33.550171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:33.550116 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" podUID="4860a8de-8ebf-4c37-b025-9aaf165b999b" containerName="ovnkube-controller" probeResult="failure" output="" Apr 21 06:26:33.558650 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:33.558620 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" podUID="4860a8de-8ebf-4c37-b025-9aaf165b999b" containerName="ovnkube-controller" probeResult="failure" output="" Apr 21 06:26:35.078690 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.078447 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:35.079251 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.078806 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhdsz" podUID="d0022157-8720-4a4c-8cf0-324fe8cb0e3f" Apr 21 06:26:35.079251 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.078813 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:35.079251 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.078922 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pfzbp" podUID="aba5693c-c88c-45ce-9751-0d5e014097eb" Apr 21 06:26:35.079419 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.079008 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:35.079472 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.079412 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frtp9" podUID="f9a14c21-e359-4c20-95a5-948922cc3ff8" Apr 21 06:26:35.355209 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.355120 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeReady" Apr 21 06:26:35.355374 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.355271 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 06:26:35.387928 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.387898 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b5976c5f8-z5cl9"] Apr 21 06:26:35.418208 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.418178 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b5976c5f8-z5cl9"] Apr 21 06:26:35.418208 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.418211 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xc224"] Apr 21 06:26:35.418413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.418342 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.421635 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.420804 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9wmrl\"" Apr 21 06:26:35.421635 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.421197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 06:26:35.421635 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.421367 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 06:26:35.421635 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.421539 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 06:26:35.427778 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.427467 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 06:26:35.434484 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.434459 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2pzsh"] Apr 21 06:26:35.434599 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.434579 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.437031 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.436846 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz466\"" Apr 21 06:26:35.437031 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.436910 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 06:26:35.437031 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.436950 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 06:26:35.454229 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.454199 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2pzsh"] Apr 21 06:26:35.454229 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.454231 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:35.454415 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.454236 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xc224"] Apr 21 06:26:35.456384 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.456247 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 06:26:35.456384 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.456265 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 06:26:35.456384 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.456283 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-98jt7\"" Apr 21 06:26:35.456384 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.456247 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 06:26:35.502022 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.501991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.502214 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.502043 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-installation-pull-secrets\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.502214 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.502068 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-bound-sa-token\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.502214 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.502136 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-image-registry-private-configuration\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.502214 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.502180 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v5fr\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-kube-api-access-2v5fr\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.502420 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.502244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-registry-certificates\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.502420 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.502278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19cbf706-822a-4927-b18a-0621751d560e-ca-trust-extracted\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.502420 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.502311 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-trusted-ca\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.602807 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.602776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcvqh\" (UniqueName: \"kubernetes.io/projected/e3794d28-61ca-4d8d-9d47-c634fc191844-kube-api-access-fcvqh\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.603002 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.602815 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19cbf706-822a-4927-b18a-0621751d560e-ca-trust-extracted\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603002 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.602849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:35.603002 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.602894 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3794d28-61ca-4d8d-9d47-c634fc191844-config-volume\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.603002 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.602918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3794d28-61ca-4d8d-9d47-c634fc191844-tmp-dir\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.603002 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.602992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-trusted-ca\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.603171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603135 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkfkp\" (UniqueName: \"kubernetes.io/projected/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-kube-api-access-dkfkp\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:35.603258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-installation-pull-secrets\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-bound-sa-token\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-image-registry-private-configuration\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603347 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v5fr\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-kube-api-access-2v5fr\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603347 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19cbf706-822a-4927-b18a-0621751d560e-ca-trust-extracted\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603347 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603314 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-registry-certificates\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.603744 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.603714 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:26:35.603744 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.603735 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found Apr 21 06:26:35.603965 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.603804 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:26:36.103779924 +0000 UTC m=+33.610244827 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found Apr 21 06:26:35.603965 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.603907 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-trusted-ca\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.604124 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.604105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-registry-certificates\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.607646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.607586 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-image-registry-private-configuration\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.607769 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.607642 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-installation-pull-secrets\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.614351 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.614302 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-bound-sa-token\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.614616 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.614589 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v5fr\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-kube-api-access-2v5fr\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:35.704067 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704028 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:35.704067 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704067 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3794d28-61ca-4d8d-9d47-c634fc191844-config-volume\") pod \"dns-default-xc224\" (UID: 
\"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.704322 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3794d28-61ca-4d8d-9d47-c634fc191844-tmp-dir\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.704322 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.704322 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704158 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkfkp\" (UniqueName: \"kubernetes.io/projected/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-kube-api-access-dkfkp\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:35.704322 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.704193 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:35.704322 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcvqh\" (UniqueName: \"kubernetes.io/projected/e3794d28-61ca-4d8d-9d47-c634fc191844-kube-api-access-fcvqh\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.704322 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.704274 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:35.704322 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.704277 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:36.204258676 +0000 UTC m=+33.710723573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found Apr 21 06:26:35.704577 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:35.704345 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:36.204319617 +0000 UTC m=+33.710784518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found Apr 21 06:26:35.704577 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704403 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e3794d28-61ca-4d8d-9d47-c634fc191844-tmp-dir\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.704577 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.704563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3794d28-61ca-4d8d-9d47-c634fc191844-config-volume\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.713395 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.713372 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcvqh\" (UniqueName: \"kubernetes.io/projected/e3794d28-61ca-4d8d-9d47-c634fc191844-kube-api-access-fcvqh\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:35.721723 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:35.721705 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkfkp\" (UniqueName: \"kubernetes.io/projected/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-kube-api-access-dkfkp\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:36.106594 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:36.106570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:36.107033 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.106683 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:26:36.107033 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.106703 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found Apr 21 06:26:36.107033 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.106781 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:26:37.106766188 +0000 UTC m=+34.613231078 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found Apr 21 06:26:36.206928 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:36.206899 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:36.207064 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:36.207008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:36.207064 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.207029 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:36.207172 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.207079 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:37.207065278 +0000 UTC m=+34.713530170 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found Apr 21 06:26:36.207172 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.207111 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:36.207172 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.207160 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:37.207143271 +0000 UTC m=+34.713608177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found Apr 21 06:26:36.293154 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:36.293115 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerStarted","Data":"19e38d4219ae6af1e81f7f82a16cd6a1c707a23580f2f3b5e1fed360c12c7073"} Apr 21 06:26:36.711112 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:36.711076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:36.711353 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.711214 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:36.711353 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.711281 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret podName:f9a14c21-e359-4c20-95a5-948922cc3ff8 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:08.711263615 +0000 UTC m=+66.217728507 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret") pod "global-pull-secret-syncer-frtp9" (UID: "f9a14c21-e359-4c20-95a5-948922cc3ff8") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:36.812121 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:36.812091 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:26:36.812264 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:36.812167 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:36.812264 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.812240 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:36.812264 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.812260 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:36.812359 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.812274 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:36.812359 ip-10-0-138-68 
kubenswrapper[2570]: E0421 06:26:36.812285 2570 projected.go:194] Error preparing data for projected volume kube-api-access-fsfgk for pod openshift-network-diagnostics/network-check-target-pfzbp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:36.812359 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.812302 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:27:08.812285194 +0000 UTC m=+66.318750100 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs") pod "network-metrics-daemon-xhdsz" (UID: "d0022157-8720-4a4c-8cf0-324fe8cb0e3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:36.812359 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:36.812318 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk podName:aba5693c-c88c-45ce-9751-0d5e014097eb nodeName:}" failed. No retries permitted until 2026-04-21 06:27:08.812308073 +0000 UTC m=+66.318772964 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fsfgk" (UniqueName: "kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk") pod "network-check-target-pfzbp" (UID: "aba5693c-c88c-45ce-9751-0d5e014097eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:37.078436 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.078356 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:26:37.078582 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.078356 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:26:37.078582 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.078375 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz"
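
Two distinct failure modes are interleaved here. Errors of the form object "kube-system"/"original-pull-secret" not registered come from the kubelet's secret/configmap caching layer: the kubelet is not yet watching that object on behalf of the pod, so the mount cannot even be attempted. The "Caches populated" reflector entries that follow show those watches being established, and the original-pull-secret mount does succeed on its next permitted retry at 06:27:08.766. By contrast, secret "image-registry-tls" not found means the lookup went through but the object genuinely does not exist in the API yet; presumably the controllers that issue image-registry-tls, canary-serving-cert, and dns-default-metrics-tls simply have not created them this early in node startup. The metrics-certs volume shows both stages in sequence: "not registered" at 06:26:36.812, then "not found" at 06:27:08.865 once the watch exists but the secret is still absent.
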
Apr 21 06:26:37.080646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.080564 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 06:26:37.080646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.080623 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 06:26:37.080646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.080623 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nvgbf\"" Apr 21 06:26:37.080890 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.080654 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 06:26:37.081187 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.081169 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 06:26:37.081296 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.081195 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p6jr2\"" Apr 21 06:26:37.114899 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.114872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:37.115368 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:37.115001 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:26:37.115368 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:37.115016 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found Apr 21 06:26:37.115368 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:37.115074 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:26:39.115055367 +0000 UTC m=+36.621520264 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found Apr 21 06:26:37.215423 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.215386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:37.215571 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.215438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:37.215571 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:37.215536 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:37.215643 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:37.215588 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:37.215643 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:37.215616 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:39.215594562 +0000 UTC m=+36.722059458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found Apr 21 06:26:37.215643 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:37.215632 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:39.215624431 +0000 UTC m=+36.722089326 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found Apr 21 06:26:37.297715 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.297680 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa39b975-a320-4be6-9871-173b44b3bf1a" containerID="19e38d4219ae6af1e81f7f82a16cd6a1c707a23580f2f3b5e1fed360c12c7073" exitCode=0 Apr 21 06:26:37.297890 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:37.297724 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerDied","Data":"19e38d4219ae6af1e81f7f82a16cd6a1c707a23580f2f3b5e1fed360c12c7073"} Apr 21 06:26:38.301607 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:38.301572 2570 generic.go:358] "Generic (PLEG): container finished" podID="aa39b975-a320-4be6-9871-173b44b3bf1a" containerID="0c50964aeedf1c6ad9b82fb7e2494bfdfe336a27837bd4a51ef5a0930f66d64c" exitCode=0 Apr 21 06:26:38.302182 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:38.301627 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerDied","Data":"0c50964aeedf1c6ad9b82fb7e2494bfdfe336a27837bd4a51ef5a0930f66d64c"} Apr 21 06:26:39.131084 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:39.131054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:39.131241 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:39.131168 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:26:39.131241 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:39.131179 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found Apr 21 06:26:39.131241 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:39.131223 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:26:43.131210119 +0000 UTC m=+40.637675009 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found Apr 21 06:26:39.232221 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:39.232188 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:39.232365 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:39.232236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:39.232365 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:39.232327 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:39.232435 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:39.232384 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:43.232366887 +0000 UTC m=+40.738831778 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found Apr 21 06:26:39.232435 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:39.232337 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:39.232516 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:39.232443 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:43.23243335 +0000 UTC m=+40.738898242 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found Apr 21 06:26:39.306381 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:39.306314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" event={"ID":"aa39b975-a320-4be6-9871-173b44b3bf1a","Type":"ContainerStarted","Data":"fd15027f37ad031e8a550076cdb61bed1e16b612a2f7bd88f4997c99dd591354"} Apr 21 06:26:39.325809 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:39.325769 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9kgs6" podStartSLOduration=5.864678135 podStartE2EDuration="36.325756215s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:26:05.618484938 +0000 UTC m=+3.124949841" lastFinishedPulling="2026-04-21 06:26:36.079563024 +0000 UTC m=+33.586027921" observedRunningTime="2026-04-21 06:26:39.324661685 +0000 UTC m=+36.831126598" watchObservedRunningTime="2026-04-21 06:26:39.325756215 +0000 UTC m=+36.832221127" Apr 21 06:26:43.159408 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:43.159359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:43.159883 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:43.159516 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:26:43.159883 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:43.159538 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found Apr 21 06:26:43.159883 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:43.159614 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:26:51.159593067 +0000 UTC m=+48.666057959 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found
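
The pod_startup_latency_tracker entry above checks out arithmetically, assuming (as in the upstream kubelet) that podStartSLOduration is the end-to-end startup time minus time spent pulling images: podStartE2EDuration ≈ observedRunningTime − podCreationTimestamp = 06:26:39.325 − 06:26:03 ≈ 36.33s; image pulls took lastFinishedPulling − firstStartedPulling = 06:26:36.080 − 06:26:05.618 ≈ 30.46s; and 36.33s − 30.46s ≈ 5.86s, matching podStartSLOduration=5.864678135. Nearly all of this pod's startup time went to image pulls rather than to the secret-mount retries logged around it.

Those retries, meanwhile, show durationBeforeRetry doubling on every failure: 500ms at 06:26:35-36, then 1s, 2s, 4s, and 8s here, with 16s, 32s, and 1m4s later in the log. A minimal sketch of that schedule, assuming upstream kubelet defaults (the 500ms start is visible in the log; the 2m2s cap from pkg/util/goroutinemap/exponentialbackoff is an assumption, not observable in this excerpt):

    package main

    import (
        "fmt"
        "time"
    )

    // Prints the delays a repeatedly failing MountVolume operation sees:
    // starting at 500ms, doubling on each failure, saturating at the cap.
    func main() {
        delay := 500 * time.Millisecond            // initial durationBeforeRetry
        maxDelay := 2*time.Minute + 2*time.Second  // assumed upstream cap (2m2s)
        for attempt := 1; attempt <= 10; attempt++ {
            fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
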
Apr 21 06:26:43.259997 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:43.259951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:43.260168 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:43.260017 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:43.260168 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:43.260094 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:43.260168 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:43.260153 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:51.26013762 +0000 UTC m=+48.766602512 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found Apr 21 06:26:43.260294 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:43.260181 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:43.260294 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:43.260244 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:51.260229939 +0000 UTC m=+48.766694830 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found Apr 21 06:26:51.220186 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:51.220147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:26:51.220629 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:51.220278 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:26:51.220629 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:51.220291 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found Apr 21 06:26:51.220629 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:51.220340 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:27:07.22032684 +0000 UTC m=+64.726791730 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found Apr 21 06:26:51.320809 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:51.320767 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:26:51.320987 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:51.320826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:26:51.320987 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:51.320930 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:51.321073 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:51.321007 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:07.320990089 +0000 UTC m=+64.827454980 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found Apr 21 06:26:51.321073 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:51.320930 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:51.321073 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:26:51.321071 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:07.32105918 +0000 UTC m=+64.827524076 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found Apr 21 06:26:58.149306 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.149272 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw"] Apr 21 06:26:58.195264 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.195227 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw"] Apr 21 06:26:58.195406 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.195300 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.197657 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.197631 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 06:26:58.197803 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.197675 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 06:26:58.197803 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.197681 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 06:26:58.197803 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.197693 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 06:26:58.198334 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.198318 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 06:26:58.198334 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.198329 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 06:26:58.198448 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.198320 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 06:26:58.273721 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.273688 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.273898 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.273726 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf927\" (UniqueName: \"kubernetes.io/projected/ededebba-2d69-4d19-b62a-9a453f81d8d3-kube-api-access-nf927\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.273898 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.273789 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-hub\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.274023 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.273956 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ededebba-2d69-4d19-b62a-9a453f81d8d3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.274023 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.273991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-ca\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.274023 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.274019 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.375158 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.375127 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-ca\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.375328 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.375166 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: 
\"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.375328 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.375271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.375328 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.375303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf927\" (UniqueName: \"kubernetes.io/projected/ededebba-2d69-4d19-b62a-9a453f81d8d3-kube-api-access-nf927\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.375328 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.375328 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-hub\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.375522 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.375432 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ededebba-2d69-4d19-b62a-9a453f81d8d3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.376199 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.376172 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/ededebba-2d69-4d19-b62a-9a453f81d8d3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.377733 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.377708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.377848 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.377791 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-ca\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.378026 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.378006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-hub-kubeconfig\") 
pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.378089 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.378005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/ededebba-2d69-4d19-b62a-9a453f81d8d3-hub\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.390224 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.390202 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf927\" (UniqueName: \"kubernetes.io/projected/ededebba-2d69-4d19-b62a-9a453f81d8d3-kube-api-access-nf927\") pod \"cluster-proxy-proxy-agent-b7696c4bd-vp5qw\" (UID: \"ededebba-2d69-4d19-b62a-9a453f81d8d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.518787 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.518689 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:26:58.636470 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:58.636439 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw"] Apr 21 06:26:58.640158 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:26:58.640121 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podededebba_2d69_4d19_b62a_9a453f81d8d3.slice/crio-9dfcc724bc9a858f32e42361f4c29eac38820e2aaf21255e292eaa5667f31665 WatchSource:0}: Error finding container 9dfcc724bc9a858f32e42361f4c29eac38820e2aaf21255e292eaa5667f31665: Status 404 returned error can't find the container with id 9dfcc724bc9a858f32e42361f4c29eac38820e2aaf21255e292eaa5667f31665 Apr 21 06:26:59.342054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:26:59.342006 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" event={"ID":"ededebba-2d69-4d19-b62a-9a453f81d8d3","Type":"ContainerStarted","Data":"9dfcc724bc9a858f32e42361f4c29eac38820e2aaf21255e292eaa5667f31665"} Apr 21 06:27:01.348161 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:01.348121 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" event={"ID":"ededebba-2d69-4d19-b62a-9a453f81d8d3","Type":"ContainerStarted","Data":"a579d69664dacc667daaaebdba15bcc49ff062fa6cc0aa13b1fc99d182e2a9d9"} Apr 21 06:27:03.563822 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:03.563795 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78qjr" Apr 21 06:27:04.356690 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:04.356654 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" event={"ID":"ededebba-2d69-4d19-b62a-9a453f81d8d3","Type":"ContainerStarted","Data":"eb722fbac4d7b25eaba087c777af00ced3f8864ac85126e053a768c06fa9ad33"} Apr 21 06:27:04.356690 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:04.356695 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" event={"ID":"ededebba-2d69-4d19-b62a-9a453f81d8d3","Type":"ContainerStarted","Data":"2678e46db4d68d1e9d3c2c215cfb88767d5f03ed06c7d34c4fc56d8de2c84777"} Apr 21 06:27:04.372492 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:04.372443 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" podStartSLOduration=0.852320475 podStartE2EDuration="6.372427049s" podCreationTimestamp="2026-04-21 06:26:58 +0000 UTC" firstStartedPulling="2026-04-21 06:26:58.641359921 +0000 UTC m=+56.147824812" lastFinishedPulling="2026-04-21 06:27:04.161466474 +0000 UTC m=+61.667931386" observedRunningTime="2026-04-21 06:27:04.371337451 +0000 UTC m=+61.877802363" watchObservedRunningTime="2026-04-21 06:27:04.372427049 +0000 UTC m=+61.878891963" Apr 21 06:27:07.241710 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:07.241668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:27:07.242118 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:07.241814 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 06:27:07.242118 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:07.241835 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found Apr 21 06:27:07.242118 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:07.241915 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:27:39.241898098 +0000 UTC m=+96.748362994 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found Apr 21 06:27:07.342174 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:07.342125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:27:07.342348 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:07.342230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:27:07.342348 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:07.342297 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:27:07.342348 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:07.342313 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:27:07.342468 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:07.342358 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:39.342345636 +0000 UTC m=+96.848810527 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found Apr 21 06:27:07.342468 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:07.342369 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:39.342364139 +0000 UTC m=+96.848829030 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found Apr 21 06:27:08.753503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.753464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:27:08.756014 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.755995 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 06:27:08.766797 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.766766 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9a14c21-e359-4c20-95a5-948922cc3ff8-original-pull-secret\") pod \"global-pull-secret-syncer-frtp9\" (UID: \"f9a14c21-e359-4c20-95a5-948922cc3ff8\") " pod="kube-system/global-pull-secret-syncer-frtp9" Apr 21 06:27:08.854767 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.854731 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp" Apr 21 06:27:08.854948 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.854800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:27:08.857077 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.857054 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 06:27:08.857077 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.857066 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 06:27:08.865638 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:08.865620 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 06:27:08.865700 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:08.865678 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs podName:d0022157-8720-4a4c-8cf0-324fe8cb0e3f nodeName:}" failed. No retries permitted until 2026-04-21 06:28:12.865661758 +0000 UTC m=+130.372126653 (durationBeforeRetry 1m4s). 
Apr 21 06:27:08.867218 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.867204 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 06:27:08.878173 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.878152 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsfgk\" (UniqueName: \"kubernetes.io/projected/aba5693c-c88c-45ce-9751-0d5e014097eb-kube-api-access-fsfgk\") pod \"network-check-target-pfzbp\" (UID: \"aba5693c-c88c-45ce-9751-0d5e014097eb\") " pod="openshift-network-diagnostics/network-check-target-pfzbp"
Apr 21 06:27:08.890589 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.890571 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nvgbf\""
Apr 21 06:27:08.894032 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.894019 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frtp9"
Apr 21 06:27:08.899712 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:08.899693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pfzbp"
Apr 21 06:27:09.008876 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:09.008782 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-frtp9"]
Apr 21 06:27:09.012253 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:27:09.012226 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a14c21_e359_4c20_95a5_948922cc3ff8.slice/crio-d8395ecbf2b477b11a26d2e2d739c6c9f456e492d1a44061f5fc45a4ced8fe09 WatchSource:0}: Error finding container d8395ecbf2b477b11a26d2e2d739c6c9f456e492d1a44061f5fc45a4ced8fe09: Status 404 returned error can't find the container with id d8395ecbf2b477b11a26d2e2d739c6c9f456e492d1a44061f5fc45a4ced8fe09
Apr 21 06:27:09.026781 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:09.026757 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pfzbp"]
Apr 21 06:27:09.040609 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:27:09.040589 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaba5693c_c88c_45ce_9751_0d5e014097eb.slice/crio-85523b1120335b2834cc6690e3a332540fe85c39180ba31a3c25bbc833bdba1a WatchSource:0}: Error finding container 85523b1120335b2834cc6690e3a332540fe85c39180ba31a3c25bbc833bdba1a: Status 404 returned error can't find the container with id 85523b1120335b2834cc6690e3a332540fe85c39180ba31a3c25bbc833bdba1a
Apr 21 06:27:09.366488 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:09.366454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-frtp9" event={"ID":"f9a14c21-e359-4c20-95a5-948922cc3ff8","Type":"ContainerStarted","Data":"d8395ecbf2b477b11a26d2e2d739c6c9f456e492d1a44061f5fc45a4ced8fe09"}
Apr 21 06:27:09.367402 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:09.367380 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pfzbp" event={"ID":"aba5693c-c88c-45ce-9751-0d5e014097eb","Type":"ContainerStarted","Data":"85523b1120335b2834cc6690e3a332540fe85c39180ba31a3c25bbc833bdba1a"}
Apr 21 06:27:14.378350 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:14.378309 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-frtp9" event={"ID":"f9a14c21-e359-4c20-95a5-948922cc3ff8","Type":"ContainerStarted","Data":"b243f32f395ed4edbf8a8886254818accd21e4ff93d38800fdc8273d29666618"}
Apr 21 06:27:14.379649 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:14.379623 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pfzbp" event={"ID":"aba5693c-c88c-45ce-9751-0d5e014097eb","Type":"ContainerStarted","Data":"cf57578518a5931c79bb466542752d5326fbdb9cbc7b9556e0ceea5d86c774f9"}
Apr 21 06:27:14.379758 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:14.379745 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pfzbp"
Apr 21 06:27:14.392725 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:14.392673 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-frtp9" podStartSLOduration=66.963689973 podStartE2EDuration="1m11.392658588s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:27:09.01397709 +0000 UTC m=+66.520441982" lastFinishedPulling="2026-04-21 06:27:13.442945692 +0000 UTC m=+70.949410597" observedRunningTime="2026-04-21 06:27:14.392159463 +0000 UTC m=+71.898624387" watchObservedRunningTime="2026-04-21 06:27:14.392658588 +0000 UTC m=+71.899123500"
Apr 21 06:27:39.273086 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:39.272984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") pod \"image-registry-b5976c5f8-z5cl9\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9"
Apr 21 06:27:39.273502 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:39.273109 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 06:27:39.273502 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:39.273128 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b5976c5f8-z5cl9: secret "image-registry-tls" not found
Apr 21 06:27:39.273502 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:39.273211 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls podName:19cbf706-822a-4927-b18a-0621751d560e nodeName:}" failed. No retries permitted until 2026-04-21 06:28:43.273187972 +0000 UTC m=+160.779652864 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls") pod "image-registry-b5976c5f8-z5cl9" (UID: "19cbf706-822a-4927-b18a-0621751d560e") : secret "image-registry-tls" not found
Apr 21 06:27:39.373563 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:39.373525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh"
Apr 21 06:27:39.373563 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:39.373567 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224"
Apr 21 06:27:39.373767 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:39.373664 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:39.373767 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:39.373676 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:39.373767 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:39.373722 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls podName:e3794d28-61ca-4d8d-9d47-c634fc191844 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:43.373709194 +0000 UTC m=+160.880174086 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls") pod "dns-default-xc224" (UID: "e3794d28-61ca-4d8d-9d47-c634fc191844") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:39.373767 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:27:39.373734 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert podName:5a6936ad-93fd-4f26-83cc-7a94f1ebcac9 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:43.373728514 +0000 UTC m=+160.880193405 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert") pod "ingress-canary-2pzsh" (UID: "5a6936ad-93fd-4f26-83cc-7a94f1ebcac9") : secret "canary-serving-cert" not found
Apr 21 06:27:45.384911 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:45.384876 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pfzbp"
Apr 21 06:27:45.398483 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:45.398040 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pfzbp" podStartSLOduration=98.003397051 podStartE2EDuration="1m42.398023186s" podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:27:09.04233024 +0000 UTC m=+66.548795134" lastFinishedPulling="2026-04-21 06:27:13.436956378 +0000 UTC m=+70.943421269" observedRunningTime="2026-04-21 06:27:14.40520912 +0000 UTC m=+71.911674043" watchObservedRunningTime="2026-04-21 06:27:45.398023186 +0000 UTC m=+102.904488101"
Apr 21 06:27:48.895569 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:48.895540 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8z4fb_ea78721d-4fb0-4884-9dfd-d0be9bbc750b/dns-node-resolver/0.log"
Apr 21 06:27:50.096914 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:27:50.096888 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z2xrz_ada9d0db-7f80-4159-9e92-7fe71d0647f6/node-ca/0.log"
Apr 21 06:28:09.237915 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.237878 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-85mq4"]
Apr 21 06:28:09.241108 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.241093 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-85mq4"
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.243540 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.243518 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 06:28:09.244479 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.244456 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 06:28:09.244610 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.244590 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 06:28:09.244710 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.244695 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7w99g\"" Apr 21 06:28:09.244763 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.244738 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 06:28:09.253989 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.253961 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-85mq4"] Apr 21 06:28:09.393390 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.393352 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c570aac3-9017-4847-9f0d-2051fa8a7f0f-data-volume\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.393558 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.393400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c570aac3-9017-4847-9f0d-2051fa8a7f0f-crio-socket\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.393558 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.393422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c570aac3-9017-4847-9f0d-2051fa8a7f0f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.393558 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.393490 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c570aac3-9017-4847-9f0d-2051fa8a7f0f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.393558 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.393519 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gtrk\" (UniqueName: \"kubernetes.io/projected/c570aac3-9017-4847-9f0d-2051fa8a7f0f-kube-api-access-4gtrk\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " 
pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494344 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494252 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c570aac3-9017-4847-9f0d-2051fa8a7f0f-data-volume\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494344 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494314 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c570aac3-9017-4847-9f0d-2051fa8a7f0f-crio-socket\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494344 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c570aac3-9017-4847-9f0d-2051fa8a7f0f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494592 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494364 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c570aac3-9017-4847-9f0d-2051fa8a7f0f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494592 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494381 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gtrk\" (UniqueName: \"kubernetes.io/projected/c570aac3-9017-4847-9f0d-2051fa8a7f0f-kube-api-access-4gtrk\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494592 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494505 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c570aac3-9017-4847-9f0d-2051fa8a7f0f-crio-socket\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494685 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494652 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c570aac3-9017-4847-9f0d-2051fa8a7f0f-data-volume\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.494915 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.494897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c570aac3-9017-4847-9f0d-2051fa8a7f0f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.496646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.496625 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c570aac3-9017-4847-9f0d-2051fa8a7f0f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.508905 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.508878 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gtrk\" (UniqueName: \"kubernetes.io/projected/c570aac3-9017-4847-9f0d-2051fa8a7f0f-kube-api-access-4gtrk\") pod \"insights-runtime-extractor-85mq4\" (UID: \"c570aac3-9017-4847-9f0d-2051fa8a7f0f\") " pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.549657 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.549629 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-85mq4" Apr 21 06:28:09.667516 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:09.667484 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-85mq4"] Apr 21 06:28:09.671072 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:09.671042 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc570aac3_9017_4847_9f0d_2051fa8a7f0f.slice/crio-8148ab5e3206c204641bc35169ac675cdff60f84c3ae2dd4ae9159f2ac454f81 WatchSource:0}: Error finding container 8148ab5e3206c204641bc35169ac675cdff60f84c3ae2dd4ae9159f2ac454f81: Status 404 returned error can't find the container with id 8148ab5e3206c204641bc35169ac675cdff60f84c3ae2dd4ae9159f2ac454f81 Apr 21 06:28:10.486793 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:10.486759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-85mq4" event={"ID":"c570aac3-9017-4847-9f0d-2051fa8a7f0f","Type":"ContainerStarted","Data":"335854976ef28f0d5c34555a05ae083f9091e533b79af6d605e0768066287234"} Apr 21 06:28:10.487175 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:10.486794 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-85mq4" event={"ID":"c570aac3-9017-4847-9f0d-2051fa8a7f0f","Type":"ContainerStarted","Data":"1f679a114871d58f05e0f31939719ffaf52e03fca97873296d45f84500ff056e"} Apr 21 06:28:10.487175 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:10.486813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-85mq4" event={"ID":"c570aac3-9017-4847-9f0d-2051fa8a7f0f","Type":"ContainerStarted","Data":"8148ab5e3206c204641bc35169ac675cdff60f84c3ae2dd4ae9159f2ac454f81"} Apr 21 06:28:12.493104 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:12.493060 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-85mq4" event={"ID":"c570aac3-9017-4847-9f0d-2051fa8a7f0f","Type":"ContainerStarted","Data":"2a139b3bfa19b5e1a4e737558894bbcdc7f4914a701d04e22f2f2be881d4b17a"} Apr 21 06:28:12.509589 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:12.509542 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-85mq4" podStartSLOduration=1.661803598 podStartE2EDuration="3.509527955s" podCreationTimestamp="2026-04-21 06:28:09 +0000 UTC" firstStartedPulling="2026-04-21 06:28:09.74397654 +0000 UTC m=+127.250441435" 
lastFinishedPulling="2026-04-21 06:28:11.591700897 +0000 UTC m=+129.098165792" observedRunningTime="2026-04-21 06:28:12.507835153 +0000 UTC m=+130.014300065" watchObservedRunningTime="2026-04-21 06:28:12.509527955 +0000 UTC m=+130.015993277" Apr 21 06:28:12.920678 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:12.920647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:28:12.923000 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:12.922970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0022157-8720-4a4c-8cf0-324fe8cb0e3f-metrics-certs\") pod \"network-metrics-daemon-xhdsz\" (UID: \"d0022157-8720-4a4c-8cf0-324fe8cb0e3f\") " pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:28:13.100656 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:13.100626 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p6jr2\"" Apr 21 06:28:13.109426 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:13.109408 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhdsz" Apr 21 06:28:13.220096 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:13.220065 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xhdsz"] Apr 21 06:28:13.223634 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:13.223592 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0022157_8720_4a4c_8cf0_324fe8cb0e3f.slice/crio-7b24bb4b329a2fb8cad7989e2366a9cc75ff0e8d6542d1106a6441eab7017942 WatchSource:0}: Error finding container 7b24bb4b329a2fb8cad7989e2366a9cc75ff0e8d6542d1106a6441eab7017942: Status 404 returned error can't find the container with id 7b24bb4b329a2fb8cad7989e2366a9cc75ff0e8d6542d1106a6441eab7017942 Apr 21 06:28:13.496717 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:13.496632 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xhdsz" event={"ID":"d0022157-8720-4a4c-8cf0-324fe8cb0e3f","Type":"ContainerStarted","Data":"7b24bb4b329a2fb8cad7989e2366a9cc75ff0e8d6542d1106a6441eab7017942"} Apr 21 06:28:14.501365 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:14.501271 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xhdsz" event={"ID":"d0022157-8720-4a4c-8cf0-324fe8cb0e3f","Type":"ContainerStarted","Data":"740fcf0de8688ca57f70d5f303bf7a4cd956c46b7292668e856f23321adc4faf"} Apr 21 06:28:14.501365 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:14.501315 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xhdsz" event={"ID":"d0022157-8720-4a4c-8cf0-324fe8cb0e3f","Type":"ContainerStarted","Data":"a505ba913552477fe10024c488d626d1110d3e5b1ccde3bd8696e8be079f09c7"} Apr 21 06:28:14.515718 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:14.515497 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xhdsz" podStartSLOduration=130.597475863 podStartE2EDuration="2m11.51547828s" 
podCreationTimestamp="2026-04-21 06:26:03 +0000 UTC" firstStartedPulling="2026-04-21 06:28:13.225520427 +0000 UTC m=+130.731985319" lastFinishedPulling="2026-04-21 06:28:14.143522845 +0000 UTC m=+131.649987736" observedRunningTime="2026-04-21 06:28:14.514517907 +0000 UTC m=+132.020982820" watchObservedRunningTime="2026-04-21 06:28:14.51547828 +0000 UTC m=+132.021943197" Apr 21 06:28:16.258708 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.258675 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg"] Apr 21 06:28:16.261508 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.261493 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:16.263790 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.263768 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 06:28:16.264058 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.264043 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-r7ptz\"" Apr 21 06:28:16.268284 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.268262 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg"] Apr 21 06:28:16.346075 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.346038 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/908bb8a3-b3fd-4ad3-8194-5f7d51de620d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zz8fg\" (UID: \"908bb8a3-b3fd-4ad3-8194-5f7d51de620d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:16.446689 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.446650 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/908bb8a3-b3fd-4ad3-8194-5f7d51de620d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zz8fg\" (UID: \"908bb8a3-b3fd-4ad3-8194-5f7d51de620d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:16.446840 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:16.446794 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 21 06:28:16.446961 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:16.446867 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908bb8a3-b3fd-4ad3-8194-5f7d51de620d-tls-certificates podName:908bb8a3-b3fd-4ad3-8194-5f7d51de620d nodeName:}" failed. No retries permitted until 2026-04-21 06:28:16.946837045 +0000 UTC m=+134.453301941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/908bb8a3-b3fd-4ad3-8194-5f7d51de620d-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-zz8fg" (UID: "908bb8a3-b3fd-4ad3-8194-5f7d51de620d") : secret "prometheus-operator-admission-webhook-tls" not found Apr 21 06:28:16.949930 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.949871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/908bb8a3-b3fd-4ad3-8194-5f7d51de620d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zz8fg\" (UID: \"908bb8a3-b3fd-4ad3-8194-5f7d51de620d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:16.952307 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:16.952279 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/908bb8a3-b3fd-4ad3-8194-5f7d51de620d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zz8fg\" (UID: \"908bb8a3-b3fd-4ad3-8194-5f7d51de620d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:17.169994 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:17.169942 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:17.281052 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:17.281016 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg"] Apr 21 06:28:17.284873 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:17.284835 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908bb8a3_b3fd_4ad3_8194_5f7d51de620d.slice/crio-3c37dbafd03a8de0a6779d0f79be4624fb7e0b2f8dd47a21400f5efb93359cb7 WatchSource:0}: Error finding container 3c37dbafd03a8de0a6779d0f79be4624fb7e0b2f8dd47a21400f5efb93359cb7: Status 404 returned error can't find the container with id 3c37dbafd03a8de0a6779d0f79be4624fb7e0b2f8dd47a21400f5efb93359cb7 Apr 21 06:28:17.512737 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:17.512658 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" event={"ID":"908bb8a3-b3fd-4ad3-8194-5f7d51de620d","Type":"ContainerStarted","Data":"3c37dbafd03a8de0a6779d0f79be4624fb7e0b2f8dd47a21400f5efb93359cb7"} Apr 21 06:28:18.518469 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:18.518383 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" event={"ID":"908bb8a3-b3fd-4ad3-8194-5f7d51de620d","Type":"ContainerStarted","Data":"ad91a6d0697a4e0393771944628ac520de4b7d1ff581290e439fcafdb01d6f76"} Apr 21 06:28:18.518901 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:18.518590 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:18.519585 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:18.519550 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" podUID="ededebba-2d69-4d19-b62a-9a453f81d8d3" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 06:28:18.523678 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:18.523660 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" Apr 21 06:28:18.532380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:18.532334 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zz8fg" podStartSLOduration=1.594766586 podStartE2EDuration="2.532319686s" podCreationTimestamp="2026-04-21 06:28:16 +0000 UTC" firstStartedPulling="2026-04-21 06:28:17.286643128 +0000 UTC m=+134.793108022" lastFinishedPulling="2026-04-21 06:28:18.224196231 +0000 UTC m=+135.730661122" observedRunningTime="2026-04-21 06:28:18.53138367 +0000 UTC m=+136.037848582" watchObservedRunningTime="2026-04-21 06:28:18.532319686 +0000 UTC m=+136.038784600" Apr 21 06:28:24.675957 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.675924 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qqfpq"] Apr 21 06:28:24.679197 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.679179 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.681414 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.681397 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 06:28:24.681673 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.681650 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7m4pt\"" Apr 21 06:28:24.682250 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.682233 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 06:28:24.682250 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.682245 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 06:28:24.682403 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.682259 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 06:28:24.682403 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.682258 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 06:28:24.682403 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.682305 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 06:28:24.702477 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702457 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-textfile\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702572 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702497 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-accelerators-collector-config\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702572 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-sys\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702572 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702551 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-wtmp\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702683 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702683 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702636 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-tls\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702683 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702663 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-metrics-client-ca\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702780 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhft\" (UniqueName: \"kubernetes.io/projected/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-kube-api-access-jdhft\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.702780 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.702710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-root\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.803738 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803697 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.803941 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803772 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-tls\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.803941 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-metrics-client-ca\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.803941 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhft\" (UniqueName: \"kubernetes.io/projected/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-kube-api-access-jdhft\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.803941 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-root\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.803941 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-textfile\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.803941 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-accelerators-collector-config\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804258 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:24.803962 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 06:28:24.804258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803983 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-root\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.804002 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-sys\") pod \"node-exporter-qqfpq\" (UID: 
\"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.803965 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-sys\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804258 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:24.804030 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-tls podName:0046ecc4-fbf2-441f-a0a9-c7b79d713ced nodeName:}" failed. No retries permitted until 2026-04-21 06:28:25.304009552 +0000 UTC m=+142.810474454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-tls") pod "node-exporter-qqfpq" (UID: "0046ecc4-fbf2-441f-a0a9-c7b79d713ced") : secret "node-exporter-tls" not found Apr 21 06:28:24.804258 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.804115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-wtmp\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804541 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.804289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-wtmp\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804541 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.804356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-textfile\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804541 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.804519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-accelerators-collector-config\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.804541 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.804534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-metrics-client-ca\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.806189 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.806170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qqfpq\" (UID: 
\"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:24.812171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:24.812148 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhft\" (UniqueName: \"kubernetes.io/projected/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-kube-api-access-jdhft\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:25.308713 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.308675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-tls\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:25.310895 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.310874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0046ecc4-fbf2-441f-a0a9-c7b79d713ced-node-exporter-tls\") pod \"node-exporter-qqfpq\" (UID: \"0046ecc4-fbf2-441f-a0a9-c7b79d713ced\") " pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:25.588277 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.588196 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qqfpq" Apr 21 06:28:25.597125 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:25.597080 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0046ecc4_fbf2_441f_a0a9_c7b79d713ced.slice/crio-e95a35de96cd29300034fd663e56095fea531b2e3c41f58fbd33d80aa22774ca WatchSource:0}: Error finding container e95a35de96cd29300034fd663e56095fea531b2e3c41f58fbd33d80aa22774ca: Status 404 returned error can't find the container with id e95a35de96cd29300034fd663e56095fea531b2e3c41f58fbd33d80aa22774ca Apr 21 06:28:25.768112 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.768060 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:28:25.772743 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.772724 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.774993 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.774966 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 06:28:25.775357 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775334 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lv9bw\"" Apr 21 06:28:25.775464 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775390 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 06:28:25.775464 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775398 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 06:28:25.775573 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775522 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 06:28:25.775724 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775708 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 06:28:25.775890 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775874 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 06:28:25.775953 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775895 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 06:28:25.775953 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.775879 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 06:28:25.776280 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.776263 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 06:28:25.785475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.785454 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:28:25.812841 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.812800 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-config-volume\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813025 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.812846 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813025 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.812932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-config-out\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813025 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.812962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-tls-assets\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813025 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.812987 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813025 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813016 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-web-config\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813183 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813064 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813183 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813096 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813183 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813123 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813183 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813144 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813324 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813324 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813224 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7g7\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-kube-api-access-ts7g7\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.813324 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.813249 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914523 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-config-volume\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914523 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914542 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-config-out\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-tls-assets\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-web-config\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 
kubenswrapper[2570]: I0421 06:28:25.914631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914661 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914693 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:25.914704 2570 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 21 06:28:25.914766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914726 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.915185 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:25.914784 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls podName:59f06050-163a-4720-b0dd-b3a1b905e054 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:26.414763156 +0000 UTC m=+143.921228048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054") : secret "alertmanager-main-tls" not found Apr 21 06:28:25.915185 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914842 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.915185 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7g7\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-kube-api-access-ts7g7\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.915185 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.914942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.915185 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.915113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.915465 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.915433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.917575 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.917543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-config-out\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.917704 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:25.917685 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle podName:59f06050-163a-4720-b0dd-b3a1b905e054 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:26.417668368 +0000 UTC m=+143.924133288 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054") : configmap references non-existent config key: ca-bundle.crt Apr 21 06:28:25.917772 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.917709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.917832 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.917795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-tls-assets\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.918005 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.917980 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.918095 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.918052 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-web-config\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.918095 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.918062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.918214 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.918199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.918280 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.918265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-config-volume\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:25.928445 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:25.928418 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7g7\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-kube-api-access-ts7g7\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:26.420346 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.420311 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:26.420459 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.420369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:26.421063 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.421039 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:26.422427 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.422405 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:26.539171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.539083 2570 generic.go:358] "Generic (PLEG): container finished" podID="0046ecc4-fbf2-441f-a0a9-c7b79d713ced" containerID="9ddb1f3f1a1487f8d57ad80487e2b4d8a7342ec5249b6997c0472190f81aa33d" exitCode=0 Apr 21 06:28:26.539171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.539134 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qqfpq" event={"ID":"0046ecc4-fbf2-441f-a0a9-c7b79d713ced","Type":"ContainerDied","Data":"9ddb1f3f1a1487f8d57ad80487e2b4d8a7342ec5249b6997c0472190f81aa33d"} Apr 21 06:28:26.539171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.539159 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qqfpq" event={"ID":"0046ecc4-fbf2-441f-a0a9-c7b79d713ced","Type":"ContainerStarted","Data":"e95a35de96cd29300034fd663e56095fea531b2e3c41f58fbd33d80aa22774ca"} Apr 21 06:28:26.682023 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.681992 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:28:26.809552 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.809471 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-959989d7d-v9t94"] Apr 21 06:28:26.814009 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.813990 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.816278 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.816251 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 06:28:26.816409 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.816294 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 06:28:26.816409 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.816305 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 06:28:26.816409 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.816308 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-h5wgt\"" Apr 21 06:28:26.816604 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.816589 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 06:28:26.816649 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.816610 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 06:28:26.816649 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.816610 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8ngdfrmpbo8q2\"" Apr 21 06:28:26.821951 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.821929 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-959989d7d-v9t94"] Apr 21 06:28:26.827795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.827751 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:28:26.830423 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:26.830400 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59f06050_163a_4720_b0dd_b3a1b905e054.slice/crio-b3c55945061f480b099f673083cbabf6eca711c3e41dd01e8777c9d17cc757e5 WatchSource:0}: Error finding container b3c55945061f480b099f673083cbabf6eca711c3e41dd01e8777c9d17cc757e5: Status 404 returned error can't find the container with id b3c55945061f480b099f673083cbabf6eca711c3e41dd01e8777c9d17cc757e5 Apr 21 06:28:26.924099 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-grpc-tls\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.924099 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtfr\" (UniqueName: \"kubernetes.io/projected/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-kube-api-access-vgtfr\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.924435 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924128 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.924435 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924175 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-tls\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.924435 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924195 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.924435 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924213 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-metrics-client-ca\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.924435 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924338 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:26.924435 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:26.924384 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025399 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-grpc-tls\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025399 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025405 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtfr\" (UniqueName: \"kubernetes.io/projected/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-kube-api-access-vgtfr\") pod \"thanos-querier-959989d7d-v9t94\" (UID: 
\"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025659 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025426 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025659 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-tls\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025659 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025659 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025624 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-metrics-client-ca\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025882 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025712 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.025882 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.025756 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.026478 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.026454 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-metrics-client-ca\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.028412 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.028382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.028508 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.028388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-tls\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.028508 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.028497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.028584 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.028559 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.028645 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.028630 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-grpc-tls\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.028680 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.028647 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.033677 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.033655 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtfr\" (UniqueName: \"kubernetes.io/projected/3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b-kube-api-access-vgtfr\") pod \"thanos-querier-959989d7d-v9t94\" (UID: \"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b\") " pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.124112 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.124079 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:27.256669 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.256633 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-959989d7d-v9t94"] Apr 21 06:28:27.259831 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:27.259800 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c03c2ab_3a81_4b9f_9a90_e4ef5d38303b.slice/crio-9679816377a85b4cd8c815c6f3b0a18985b9d171904b5aabce11112278f39f00 WatchSource:0}: Error finding container 9679816377a85b4cd8c815c6f3b0a18985b9d171904b5aabce11112278f39f00: Status 404 returned error can't find the container with id 9679816377a85b4cd8c815c6f3b0a18985b9d171904b5aabce11112278f39f00 Apr 21 06:28:27.545401 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.545314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qqfpq" event={"ID":"0046ecc4-fbf2-441f-a0a9-c7b79d713ced","Type":"ContainerStarted","Data":"e6518d677a3d622f119b2119070d7e0f10c16d162cbd378ee0eb6ff4f8e66b24"} Apr 21 06:28:27.545401 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.545361 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qqfpq" event={"ID":"0046ecc4-fbf2-441f-a0a9-c7b79d713ced","Type":"ContainerStarted","Data":"88dbf403111fa6105a805a1c6eb66c7e4c35aeab97fae27e3579f08a5225a267"} Apr 21 06:28:27.546765 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.546731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerStarted","Data":"b3c55945061f480b099f673083cbabf6eca711c3e41dd01e8777c9d17cc757e5"} Apr 21 06:28:27.547784 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.547760 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" event={"ID":"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b","Type":"ContainerStarted","Data":"9679816377a85b4cd8c815c6f3b0a18985b9d171904b5aabce11112278f39f00"} Apr 21 06:28:27.563457 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:27.563405 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qqfpq" podStartSLOduration=2.904182275 podStartE2EDuration="3.56339185s" podCreationTimestamp="2026-04-21 06:28:24 +0000 UTC" firstStartedPulling="2026-04-21 06:28:25.599126342 +0000 UTC m=+143.105591236" lastFinishedPulling="2026-04-21 06:28:26.258335917 +0000 UTC m=+143.764800811" observedRunningTime="2026-04-21 06:28:27.561878959 +0000 UTC m=+145.068343874" watchObservedRunningTime="2026-04-21 06:28:27.56339185 +0000 UTC m=+145.069856762" Apr 21 06:28:28.519928 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:28.519888 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" podUID="ededebba-2d69-4d19-b62a-9a453f81d8d3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 06:28:28.551781 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:28.551746 2570 generic.go:358] "Generic (PLEG): container finished" podID="59f06050-163a-4720-b0dd-b3a1b905e054" containerID="2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649" exitCode=0 Apr 21 06:28:28.551972 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:28.551838 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649"} Apr 21 06:28:29.433028 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.432988 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx"] Apr 21 06:28:29.436503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.436479 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:29.438454 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.438429 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 06:28:29.438574 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.438434 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-chtx7\"" Apr 21 06:28:29.443537 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.443510 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx"] Apr 21 06:28:29.548670 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.548628 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4d3c5ba6-9729-40ec-8881-cff62bfb8bb3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-mrrfx\" (UID: \"4d3c5ba6-9729-40ec-8881-cff62bfb8bb3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:29.557827 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.557794 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" event={"ID":"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b","Type":"ContainerStarted","Data":"75e5e57bf4d07fd7ad9441a434ad6f0ec86c87f8449b1cc9d37c4c9760d50482"} Apr 21 06:28:29.557971 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.557837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" event={"ID":"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b","Type":"ContainerStarted","Data":"db0eeff367f3547d3557f388977a2ae6513abd93c111912142b98a0d563f41ed"} Apr 21 06:28:29.557971 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.557870 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" event={"ID":"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b","Type":"ContainerStarted","Data":"b3102f8596ffa108768836e31c213f693575e7314b5ff8bcfad7f26129c4cb10"} Apr 21 06:28:29.650092 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:29.650039 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4d3c5ba6-9729-40ec-8881-cff62bfb8bb3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-mrrfx\" (UID: \"4d3c5ba6-9729-40ec-8881-cff62bfb8bb3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:29.650270 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:29.650205 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 06:28:29.650333 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:29.650283 2570 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/4d3c5ba6-9729-40ec-8881-cff62bfb8bb3-monitoring-plugin-cert podName:4d3c5ba6-9729-40ec-8881-cff62bfb8bb3 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:30.150261965 +0000 UTC m=+147.656726862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/4d3c5ba6-9729-40ec-8881-cff62bfb8bb3-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-mrrfx" (UID: "4d3c5ba6-9729-40ec-8881-cff62bfb8bb3") : secret "monitoring-plugin-cert" not found Apr 21 06:28:30.153442 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.153411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4d3c5ba6-9729-40ec-8881-cff62bfb8bb3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-mrrfx\" (UID: \"4d3c5ba6-9729-40ec-8881-cff62bfb8bb3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:30.155463 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.155445 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4d3c5ba6-9729-40ec-8881-cff62bfb8bb3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-mrrfx\" (UID: \"4d3c5ba6-9729-40ec-8881-cff62bfb8bb3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:30.347709 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.347670 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:30.464095 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.464059 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx"] Apr 21 06:28:30.466775 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:30.466748 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d3c5ba6_9729_40ec_8881_cff62bfb8bb3.slice/crio-ad9de68175bf7722d33916e212206da689e5313db21c7903d634f697d8ac4788 WatchSource:0}: Error finding container ad9de68175bf7722d33916e212206da689e5313db21c7903d634f697d8ac4788: Status 404 returned error can't find the container with id ad9de68175bf7722d33916e212206da689e5313db21c7903d634f697d8ac4788 Apr 21 06:28:30.563042 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.562951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" event={"ID":"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b","Type":"ContainerStarted","Data":"32e060788de384efa6af450898b6926df26584566ee75c3f4f93edf0043690b7"} Apr 21 06:28:30.563042 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.562990 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" event={"ID":"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b","Type":"ContainerStarted","Data":"a58d58050731fcbfa8c576aa3b2d7a5e32e1e51f5201a538565156db031d1f43"} Apr 21 06:28:30.563042 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.563002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" event={"ID":"3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b","Type":"ContainerStarted","Data":"1ae3a83bc33fa42a0299b54bd7a0293ed34bdb7750127b64a1f002bb1edc2413"} Apr 21 06:28:30.563539 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.563120 
2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:30.565697 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.565672 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerStarted","Data":"c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225"} Apr 21 06:28:30.565795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.565702 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerStarted","Data":"35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397"} Apr 21 06:28:30.565795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.565718 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerStarted","Data":"2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd"} Apr 21 06:28:30.565795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.565731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerStarted","Data":"95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d"} Apr 21 06:28:30.565795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.565742 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerStarted","Data":"9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679"} Apr 21 06:28:30.565795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.565753 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerStarted","Data":"41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6"} Apr 21 06:28:30.566728 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.566706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" event={"ID":"4d3c5ba6-9729-40ec-8881-cff62bfb8bb3","Type":"ContainerStarted","Data":"ad9de68175bf7722d33916e212206da689e5313db21c7903d634f697d8ac4788"} Apr 21 06:28:30.583795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.583758 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" podStartSLOduration=1.8237202350000001 podStartE2EDuration="4.583745598s" podCreationTimestamp="2026-04-21 06:28:26 +0000 UTC" firstStartedPulling="2026-04-21 06:28:27.262149039 +0000 UTC m=+144.768613932" lastFinishedPulling="2026-04-21 06:28:30.022174404 +0000 UTC m=+147.528639295" observedRunningTime="2026-04-21 06:28:30.582210403 +0000 UTC m=+148.088675319" watchObservedRunningTime="2026-04-21 06:28:30.583745598 +0000 UTC m=+148.090210511" Apr 21 06:28:30.605795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.605622 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.727515206 podStartE2EDuration="5.605608728s" podCreationTimestamp="2026-04-21 06:28:25 +0000 UTC" firstStartedPulling="2026-04-21 06:28:26.832235413 +0000 UTC 
m=+144.338700305" lastFinishedPulling="2026-04-21 06:28:29.71032893 +0000 UTC m=+147.216793827" observedRunningTime="2026-04-21 06:28:30.604396704 +0000 UTC m=+148.110861616" watchObservedRunningTime="2026-04-21 06:28:30.605608728 +0000 UTC m=+148.112073642" Apr 21 06:28:30.888090 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.888051 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:28:30.892242 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.892217 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:30.894594 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.894567 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9c34ncig7c1ms\"" Apr 21 06:28:30.894715 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.894593 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 06:28:30.895074 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895049 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v59qm\"" Apr 21 06:28:30.895074 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895067 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 06:28:30.895240 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895145 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 06:28:30.895647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895531 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 06:28:30.895758 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895699 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 06:28:30.895817 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895797 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 06:28:30.895892 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895882 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 06:28:30.895953 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895893 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 06:28:30.895953 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895801 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 06:28:30.896047 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.895884 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 06:28:30.896100 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.896051 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 06:28:30.896203 ip-10-0-138-68 kubenswrapper[2570]: I0421 
06:28:30.896140 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 06:28:30.898007 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.897986 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 06:28:30.904342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.904316 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:28:30.962340 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:30.962546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:30.962546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:30.962546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:30.962546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962493 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:30.962546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962524 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:30.962795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 21 06:28:30.962795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.962795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962702 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.962795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962743 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-web-config\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.962795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xs5\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-kube-api-access-s8xs5\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.963054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962806 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.963054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.963054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.963054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962936 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config-out\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.963054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.962972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.963320 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.963061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:30.963320 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:30.963089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064178 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064186 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config-out\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064294 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064415 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064444 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.064646 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.065122 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.064957 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.065629 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.065229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.065629 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.065463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-web-config\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.065629 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.065527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xs5\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-kube-api-access-s8xs5\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.066001 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.065739 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.066136 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.066075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.069087 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.069064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.069642 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.069609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.069928 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.069904 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config-out\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.070123 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.070099 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.070461 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.070435 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.070979 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.070953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.071117 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.071093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.071297 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.071273 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.071810 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.071504 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.071810 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.071769 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-web-config\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.072159 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.072131 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.072664 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.072623 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.073070 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.073049 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.073503 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.073486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.075789 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.075772 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xs5\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-kube-api-access-s8xs5\") pod \"prometheus-k8s-0\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.204603 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.204508 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 06:28:31.559610 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.559578 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b5976c5f8-z5cl9"]
Apr 21 06:28:31.559845 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:31.559825 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" podUID="19cbf706-822a-4927-b18a-0621751d560e"
Apr 21 06:28:31.569840 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.569816 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9"
Need to start a new one" pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:28:31.586347 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.586319 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:28:31.589282 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:31.589256 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e934c6a_c383_4e26_9283_e31dd7b3c42c.slice/crio-8425844536f389c57e2fa166e63384c1bf12bed020d80aa4fa94f7e52bd11c26 WatchSource:0}: Error finding container 8425844536f389c57e2fa166e63384c1bf12bed020d80aa4fa94f7e52bd11c26: Status 404 returned error can't find the container with id 8425844536f389c57e2fa166e63384c1bf12bed020d80aa4fa94f7e52bd11c26 Apr 21 06:28:31.670378 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.670339 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-image-registry-private-configuration\") pod \"19cbf706-822a-4927-b18a-0621751d560e\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " Apr 21 06:28:31.670539 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.670446 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19cbf706-822a-4927-b18a-0621751d560e-ca-trust-extracted\") pod \"19cbf706-822a-4927-b18a-0621751d560e\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " Apr 21 06:28:31.670539 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.670486 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-trusted-ca\") pod \"19cbf706-822a-4927-b18a-0621751d560e\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " Apr 21 06:28:31.670539 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.670536 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v5fr\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-kube-api-access-2v5fr\") pod \"19cbf706-822a-4927-b18a-0621751d560e\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " Apr 21 06:28:31.670694 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.670570 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-registry-certificates\") pod \"19cbf706-822a-4927-b18a-0621751d560e\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " Apr 21 06:28:31.670694 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.670596 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-bound-sa-token\") pod \"19cbf706-822a-4927-b18a-0621751d560e\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " Apr 21 06:28:31.670694 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.670626 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-installation-pull-secrets\") pod \"19cbf706-822a-4927-b18a-0621751d560e\" (UID: \"19cbf706-822a-4927-b18a-0621751d560e\") " Apr 21 06:28:31.670694 ip-10-0-138-68 
kubenswrapper[2570]: I0421 06:28:31.670679 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19cbf706-822a-4927-b18a-0621751d560e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "19cbf706-822a-4927-b18a-0621751d560e" (UID: "19cbf706-822a-4927-b18a-0621751d560e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:28:31.671763 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.671268 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19cbf706-822a-4927-b18a-0621751d560e-ca-trust-extracted\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:31.671763 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.671468 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "19cbf706-822a-4927-b18a-0621751d560e" (UID: "19cbf706-822a-4927-b18a-0621751d560e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:31.671763 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.671720 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "19cbf706-822a-4927-b18a-0621751d560e" (UID: "19cbf706-822a-4927-b18a-0621751d560e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:28:31.673216 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.673190 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "19cbf706-822a-4927-b18a-0621751d560e" (UID: "19cbf706-822a-4927-b18a-0621751d560e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:28:31.673698 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.673676 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "19cbf706-822a-4927-b18a-0621751d560e" (UID: "19cbf706-822a-4927-b18a-0621751d560e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:28:31.673764 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.673748 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "19cbf706-822a-4927-b18a-0621751d560e" (UID: "19cbf706-822a-4927-b18a-0621751d560e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:28:31.674171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.674155 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-kube-api-access-2v5fr" (OuterVolumeSpecName: "kube-api-access-2v5fr") pod "19cbf706-822a-4927-b18a-0621751d560e" (UID: "19cbf706-822a-4927-b18a-0621751d560e"). 
InnerVolumeSpecName "kube-api-access-2v5fr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:28:31.771739 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.771646 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-trusted-ca\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:31.771739 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.771681 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2v5fr\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-kube-api-access-2v5fr\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:31.771739 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.771695 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19cbf706-822a-4927-b18a-0621751d560e-registry-certificates\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:31.771739 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.771708 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-bound-sa-token\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:31.771739 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.771721 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-installation-pull-secrets\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:31.771739 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:31.771733 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19cbf706-822a-4927-b18a-0621751d560e-image-registry-private-configuration\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:32.573549 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.573515 2570 generic.go:358] "Generic (PLEG): container finished" podID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" exitCode=0 Apr 21 06:28:32.573984 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.573600 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f"} Apr 21 06:28:32.573984 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.573633 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerStarted","Data":"8425844536f389c57e2fa166e63384c1bf12bed020d80aa4fa94f7e52bd11c26"} Apr 21 06:28:32.574983 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.574957 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b5976c5f8-z5cl9" Apr 21 06:28:32.574983 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.574976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" event={"ID":"4d3c5ba6-9729-40ec-8881-cff62bfb8bb3","Type":"ContainerStarted","Data":"75a1525179fc9786549e068ba052f031d380fe4f55f9814a3a11bdabda9e177c"} Apr 21 06:28:32.575379 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.575358 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:32.580289 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.580272 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" Apr 21 06:28:32.623038 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.623005 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b5976c5f8-z5cl9"] Apr 21 06:28:32.626430 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.626404 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b5976c5f8-z5cl9"] Apr 21 06:28:32.638387 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.638336 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-mrrfx" podStartSLOduration=2.601241586 podStartE2EDuration="3.638320758s" podCreationTimestamp="2026-04-21 06:28:29 +0000 UTC" firstStartedPulling="2026-04-21 06:28:30.468539018 +0000 UTC m=+147.975003910" lastFinishedPulling="2026-04-21 06:28:31.505618181 +0000 UTC m=+149.012083082" observedRunningTime="2026-04-21 06:28:32.637433315 +0000 UTC m=+150.143898228" watchObservedRunningTime="2026-04-21 06:28:32.638320758 +0000 UTC m=+150.144785671" Apr 21 06:28:32.679500 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:32.679273 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19cbf706-822a-4927-b18a-0621751d560e-registry-tls\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:28:33.083923 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:33.083889 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cbf706-822a-4927-b18a-0621751d560e" path="/var/lib/kubelet/pods/19cbf706-822a-4927-b18a-0621751d560e/volumes" Apr 21 06:28:35.585751 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:35.585676 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerStarted","Data":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} Apr 21 06:28:35.585751 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:35.585708 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerStarted","Data":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} Apr 21 06:28:35.585751 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:35.585722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerStarted","Data":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} Apr 21 06:28:35.585751 ip-10-0-138-68 
kubenswrapper[2570]: I0421 06:28:35.585731 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerStarted","Data":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} Apr 21 06:28:35.585751 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:35.585740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerStarted","Data":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} Apr 21 06:28:35.585751 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:35.585748 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerStarted","Data":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} Apr 21 06:28:35.613824 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:35.613776 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.251194777 podStartE2EDuration="5.613760889s" podCreationTimestamp="2026-04-21 06:28:30 +0000 UTC" firstStartedPulling="2026-04-21 06:28:32.574823637 +0000 UTC m=+150.081288528" lastFinishedPulling="2026-04-21 06:28:34.937389746 +0000 UTC m=+152.443854640" observedRunningTime="2026-04-21 06:28:35.613235444 +0000 UTC m=+153.119700378" watchObservedRunningTime="2026-04-21 06:28:35.613760889 +0000 UTC m=+153.120225802" Apr 21 06:28:36.205374 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:36.205338 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:28:36.576160 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:36.576086 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-959989d7d-v9t94" Apr 21 06:28:38.444922 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:38.444874 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xc224" podUID="e3794d28-61ca-4d8d-9d47-c634fc191844" Apr 21 06:28:38.464009 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:28:38.463981 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2pzsh" podUID="5a6936ad-93fd-4f26-83cc-7a94f1ebcac9" Apr 21 06:28:38.520015 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:38.519980 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" podUID="ededebba-2d69-4d19-b62a-9a453f81d8d3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 06:28:38.520153 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:38.520058 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" Apr 21 06:28:38.520568 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:38.520537 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
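The `pod_startup_latency_tracker` entries above record both an SLO-adjusted duration and a quoted end-to-end duration. A small extractor, under the same stdin assumption; `podStartSLOduration` is a bare float in seconds, while `podStartE2EDuration` is a Go duration string such as "3.638320758s" or "2m10.658360868s" and is printed as-is rather than parsed:

```python
import re
import sys

# Pull the two duration fields out of each "Observed pod startup duration" line.
LAT = re.compile(r'"Observed pod startup duration" pod="([^"]+)" '
                 r'podStartSLOduration=(\S+) podStartE2EDuration="([^"]+)"')

for line in sys.stdin:
    for pod, slo, e2e in LAT.findall(line):
        print(f"{pod}: SLO {slo}s, end-to-end {e2e}")
```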
containerStatusID={"Type":"cri-o","ID":"eb722fbac4d7b25eaba087c777af00ced3f8864ac85126e053a768c06fa9ad33"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 06:28:38.520626 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:38.520587 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" podUID="ededebba-2d69-4d19-b62a-9a453f81d8d3" containerName="service-proxy" containerID="cri-o://eb722fbac4d7b25eaba087c777af00ced3f8864ac85126e053a768c06fa9ad33" gracePeriod=30 Apr 21 06:28:38.596744 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:38.596717 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:28:38.596909 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:38.596719 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xc224" Apr 21 06:28:39.601240 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:39.601199 2570 generic.go:358] "Generic (PLEG): container finished" podID="ededebba-2d69-4d19-b62a-9a453f81d8d3" containerID="eb722fbac4d7b25eaba087c777af00ced3f8864ac85126e053a768c06fa9ad33" exitCode=2 Apr 21 06:28:39.601637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:39.601255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" event={"ID":"ededebba-2d69-4d19-b62a-9a453f81d8d3","Type":"ContainerDied","Data":"eb722fbac4d7b25eaba087c777af00ced3f8864ac85126e053a768c06fa9ad33"} Apr 21 06:28:39.601637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:39.601280 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b7696c4bd-vp5qw" event={"ID":"ededebba-2d69-4d19-b62a-9a453f81d8d3","Type":"ContainerStarted","Data":"4b64587d276e622ad520a25a183d880beeb26efe0ba55160d26ed5245211dc3f"} Apr 21 06:28:43.378157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.378102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: \"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:28:43.378157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.378171 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:28:43.380500 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.380478 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e3794d28-61ca-4d8d-9d47-c634fc191844-metrics-tls\") pod \"dns-default-xc224\" (UID: \"e3794d28-61ca-4d8d-9d47-c634fc191844\") " pod="openshift-dns/dns-default-xc224" Apr 21 06:28:43.380918 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.380901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a6936ad-93fd-4f26-83cc-7a94f1ebcac9-cert\") pod \"ingress-canary-2pzsh\" (UID: 
\"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9\") " pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:28:43.399843 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.399813 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-98jt7\"" Apr 21 06:28:43.400398 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.400382 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz466\"" Apr 21 06:28:43.408863 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.408838 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2pzsh" Apr 21 06:28:43.408968 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.408952 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xc224" Apr 21 06:28:43.533358 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.533185 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xc224"] Apr 21 06:28:43.536110 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:43.536081 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3794d28_61ca_4d8d_9d47_c634fc191844.slice/crio-036076cc7fc920d778e40f9b3c9eb4c94a7803a2e1b86e58bf6a371dcc62ae3c WatchSource:0}: Error finding container 036076cc7fc920d778e40f9b3c9eb4c94a7803a2e1b86e58bf6a371dcc62ae3c: Status 404 returned error can't find the container with id 036076cc7fc920d778e40f9b3c9eb4c94a7803a2e1b86e58bf6a371dcc62ae3c Apr 21 06:28:43.546698 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.546635 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2pzsh"] Apr 21 06:28:43.549666 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:28:43.549645 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a6936ad_93fd_4f26_83cc_7a94f1ebcac9.slice/crio-1c2265e24dedf88c0effda38d7e35a6aab4e6ea14b93af7873466e53c1e1941c WatchSource:0}: Error finding container 1c2265e24dedf88c0effda38d7e35a6aab4e6ea14b93af7873466e53c1e1941c: Status 404 returned error can't find the container with id 1c2265e24dedf88c0effda38d7e35a6aab4e6ea14b93af7873466e53c1e1941c Apr 21 06:28:43.617099 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.617065 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xc224" event={"ID":"e3794d28-61ca-4d8d-9d47-c634fc191844","Type":"ContainerStarted","Data":"036076cc7fc920d778e40f9b3c9eb4c94a7803a2e1b86e58bf6a371dcc62ae3c"} Apr 21 06:28:43.617970 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:43.617948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2pzsh" event={"ID":"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9","Type":"ContainerStarted","Data":"1c2265e24dedf88c0effda38d7e35a6aab4e6ea14b93af7873466e53c1e1941c"} Apr 21 06:28:45.625792 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:45.625756 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2pzsh" event={"ID":"5a6936ad-93fd-4f26-83cc-7a94f1ebcac9","Type":"ContainerStarted","Data":"0fdeed98f5aa81f18d778e04239e81f2e22922bd4a20e3cf56c5ec14f22eb0d8"} Apr 21 06:28:45.627346 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:45.627321 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-xc224" event={"ID":"e3794d28-61ca-4d8d-9d47-c634fc191844","Type":"ContainerStarted","Data":"e65345f82720fee72d6d485da5380879749bde092d530eb595239acc8b3a9dec"} Apr 21 06:28:45.627346 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:45.627348 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xc224" event={"ID":"e3794d28-61ca-4d8d-9d47-c634fc191844","Type":"ContainerStarted","Data":"ad5b13491f6544c19a74404b6f4dc5181f79accd7f6990ac8f8154080a06b24f"} Apr 21 06:28:45.627588 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:45.627377 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xc224" Apr 21 06:28:45.641150 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:45.641099 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2pzsh" podStartSLOduration=128.84823945 podStartE2EDuration="2m10.641085605s" podCreationTimestamp="2026-04-21 06:26:35 +0000 UTC" firstStartedPulling="2026-04-21 06:28:43.551543766 +0000 UTC m=+161.058008658" lastFinishedPulling="2026-04-21 06:28:45.344389922 +0000 UTC m=+162.850854813" observedRunningTime="2026-04-21 06:28:45.63988995 +0000 UTC m=+163.146354862" watchObservedRunningTime="2026-04-21 06:28:45.641085605 +0000 UTC m=+163.147550519" Apr 21 06:28:45.658428 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:45.658381 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xc224" podStartSLOduration=128.856272366 podStartE2EDuration="2m10.658360868s" podCreationTimestamp="2026-04-21 06:26:35 +0000 UTC" firstStartedPulling="2026-04-21 06:28:43.538475822 +0000 UTC m=+161.044940716" lastFinishedPulling="2026-04-21 06:28:45.340564323 +0000 UTC m=+162.847029218" observedRunningTime="2026-04-21 06:28:45.657567292 +0000 UTC m=+163.164032205" watchObservedRunningTime="2026-04-21 06:28:45.658360868 +0000 UTC m=+163.164825782" Apr 21 06:28:55.632812 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:28:55.632735 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xc224" Apr 21 06:29:03.111908 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:03.111876 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2pzsh_5a6936ad-93fd-4f26-83cc-7a94f1ebcac9/serve-healthcheck-canary/0.log" Apr 21 06:29:31.205013 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:31.204965 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:31.224373 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:31.224342 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:31.773800 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:31.773767 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:44.962414 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:44.962378 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:29:44.963030 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:44.962969 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="alertmanager" 
containerID="cri-o://41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6" gracePeriod=120 Apr 21 06:29:44.963242 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:44.963022 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-metric" containerID="cri-o://35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397" gracePeriod=120 Apr 21 06:29:44.963242 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:44.963054 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-web" containerID="cri-o://95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d" gracePeriod=120 Apr 21 06:29:44.963242 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:44.963071 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy" containerID="cri-o://2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd" gracePeriod=120 Apr 21 06:29:44.963242 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:44.963088 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="config-reloader" containerID="cri-o://9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679" gracePeriod=120 Apr 21 06:29:44.963242 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:44.963145 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="prom-label-proxy" containerID="cri-o://c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225" gracePeriod=120 Apr 21 06:29:45.800935 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.800882 2570 generic.go:358] "Generic (PLEG): container finished" podID="59f06050-163a-4720-b0dd-b3a1b905e054" containerID="c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225" exitCode=0 Apr 21 06:29:45.800935 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.800923 2570 generic.go:358] "Generic (PLEG): container finished" podID="59f06050-163a-4720-b0dd-b3a1b905e054" containerID="35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397" exitCode=0 Apr 21 06:29:45.800935 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.800932 2570 generic.go:358] "Generic (PLEG): container finished" podID="59f06050-163a-4720-b0dd-b3a1b905e054" containerID="2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd" exitCode=0 Apr 21 06:29:45.800935 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.800939 2570 generic.go:358] "Generic (PLEG): container finished" podID="59f06050-163a-4720-b0dd-b3a1b905e054" containerID="9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679" exitCode=0 Apr 21 06:29:45.801195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.800948 2570 generic.go:358] "Generic (PLEG): container finished" podID="59f06050-163a-4720-b0dd-b3a1b905e054" containerID="41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6" exitCode=0 Apr 21 06:29:45.801195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.800957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225"} Apr 21 06:29:45.801195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.800994 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397"} Apr 21 06:29:45.801195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.801005 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd"} Apr 21 06:29:45.801195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.801015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679"} Apr 21 06:29:45.801195 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:45.801024 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6"} Apr 21 06:29:46.196357 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.196334 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.259697 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259668 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-tls-assets\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.259697 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259704 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-metrics-client-ca\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259725 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-metric\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259749 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-web\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259769 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-web-config\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259790 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259806 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts7g7\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-kube-api-access-ts7g7\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259832 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259872 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-cluster-tls-config\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259913 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-main-db\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.259969 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-config-volume\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.260002 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-config-out\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.260029 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle\") pod \"59f06050-163a-4720-b0dd-b3a1b905e054\" (UID: \"59f06050-163a-4720-b0dd-b3a1b905e054\") " Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.260149 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod 
"59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:29:46.260317 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.260282 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-metrics-client-ca\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.261096 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.260569 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:29:46.261716 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.261691 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:29:46.263294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.263253 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:46.263294 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.263267 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:29:46.263759 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.263733 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:46.264135 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.264093 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:46.264387 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.264357 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:46.264475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.264385 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-kube-api-access-ts7g7" (OuterVolumeSpecName: "kube-api-access-ts7g7") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "kube-api-access-ts7g7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:29:46.264717 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.264697 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-config-volume" (OuterVolumeSpecName: "config-volume") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:46.265207 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.265185 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-config-out" (OuterVolumeSpecName: "config-out") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:29:46.268085 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.268063 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:46.273775 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.273753 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-web-config" (OuterVolumeSpecName: "web-config") pod "59f06050-163a-4720-b0dd-b3a1b905e054" (UID: "59f06050-163a-4720-b0dd-b3a1b905e054"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361043 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-config-volume\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361077 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-config-out\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361088 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361097 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-tls-assets\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361106 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361117 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361127 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-web-config\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361136 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-main-tls\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361146 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ts7g7\" (UniqueName: \"kubernetes.io/projected/59f06050-163a-4720-b0dd-b3a1b905e054-kube-api-access-ts7g7\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361155 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361165 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/59f06050-163a-4720-b0dd-b3a1b905e054-cluster-tls-config\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.361504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.361175 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/59f06050-163a-4720-b0dd-b3a1b905e054-alertmanager-main-db\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:46.806298 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.806263 2570 generic.go:358] "Generic (PLEG): container finished" podID="59f06050-163a-4720-b0dd-b3a1b905e054" containerID="95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d" exitCode=0 Apr 21 06:29:46.806459 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.806330 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d"} Apr 21 06:29:46.806459 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.806365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"59f06050-163a-4720-b0dd-b3a1b905e054","Type":"ContainerDied","Data":"b3c55945061f480b099f673083cbabf6eca711c3e41dd01e8777c9d17cc757e5"} Apr 21 06:29:46.806459 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.806364 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.806459 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.806378 2570 scope.go:117] "RemoveContainer" containerID="c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225" Apr 21 06:29:46.813505 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.813485 2570 scope.go:117] "RemoveContainer" containerID="35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397" Apr 21 06:29:46.820052 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.820036 2570 scope.go:117] "RemoveContainer" containerID="2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd" Apr 21 06:29:46.827717 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.827678 2570 scope.go:117] "RemoveContainer" containerID="95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d" Apr 21 06:29:46.827817 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.827771 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:29:46.832011 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.831734 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:29:46.837425 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.837410 2570 scope.go:117] "RemoveContainer" containerID="9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679" Apr 21 06:29:46.843712 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.843696 2570 scope.go:117] "RemoveContainer" containerID="41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6" Apr 21 06:29:46.849942 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.849924 2570 scope.go:117] "RemoveContainer" containerID="2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649" Apr 21 06:29:46.856415 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856389 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 
06:29:46.856721 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856701 2570 scope.go:117] "RemoveContainer" containerID="c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225" Apr 21 06:29:46.856876 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856793 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-web" Apr 21 06:29:46.856876 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856807 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-web" Apr 21 06:29:46.856876 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856836 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="alertmanager" Apr 21 06:29:46.856876 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856845 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="alertmanager" Apr 21 06:29:46.856876 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856873 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="prom-label-proxy" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856882 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="prom-label-proxy" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856903 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-metric" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856911 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-metric" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856920 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="init-config-reloader" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856928 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="init-config-reloader" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856939 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856946 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856957 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="config-reloader" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.856965 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="config-reloader" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857041 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" 
containerName="prom-label-proxy" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857055 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-web" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857068 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857076 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="alertmanager" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857082 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="kube-rbac-proxy-metric" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857089 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" containerName="config-reloader" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:46.857083 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225\": container with ID starting with c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225 not found: ID does not exist" containerID="c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857110 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225"} err="failed to get container status \"c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225\": rpc error: code = NotFound desc = could not find container \"c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225\": container with ID starting with c539ac08fbbb723fdb8627b2d1c79b80ccbdb609d05b3d9c2170514459fd9225 not found: ID does not exist" Apr 21 06:29:46.857157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857145 2570 scope.go:117] "RemoveContainer" containerID="35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397" Apr 21 06:29:46.857782 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:46.857407 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397\": container with ID starting with 35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397 not found: ID does not exist" containerID="35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397" Apr 21 06:29:46.857782 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857444 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397"} err="failed to get container status \"35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397\": rpc error: code = NotFound desc = could not find container \"35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397\": container with ID starting with 35cbfcd8b6950303df79e01cf875bdd60f4b12ad25bf75cc13b54070b2860397 not found: ID does not exist" Apr 21 06:29:46.857782 
ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857461 2570 scope.go:117] "RemoveContainer" containerID="2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd" Apr 21 06:29:46.857782 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:46.857681 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd\": container with ID starting with 2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd not found: ID does not exist" containerID="2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd" Apr 21 06:29:46.857782 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857701 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd"} err="failed to get container status \"2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd\": rpc error: code = NotFound desc = could not find container \"2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd\": container with ID starting with 2702324f055405fe47a0addb7c2aeda296f89b2f67691c4881a4731370951bbd not found: ID does not exist" Apr 21 06:29:46.857782 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857716 2570 scope.go:117] "RemoveContainer" containerID="95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d" Apr 21 06:29:46.858041 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:46.857933 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d\": container with ID starting with 95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d not found: ID does not exist" containerID="95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d" Apr 21 06:29:46.858041 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857955 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d"} err="failed to get container status \"95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d\": rpc error: code = NotFound desc = could not find container \"95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d\": container with ID starting with 95cf2d091814b5d9f7ab3caa73e0254f127d65af2e4d562d89570f84567b546d not found: ID does not exist" Apr 21 06:29:46.858041 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.857971 2570 scope.go:117] "RemoveContainer" containerID="9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679" Apr 21 06:29:46.858216 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:46.858198 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679\": container with ID starting with 9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679 not found: ID does not exist" containerID="9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679" Apr 21 06:29:46.858256 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.858222 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679"} err="failed to get container status 
\"9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679\": rpc error: code = NotFound desc = could not find container \"9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679\": container with ID starting with 9935e53142832c96f914fe58e16a8a662349a1628738d4b11c71f97d5f085679 not found: ID does not exist" Apr 21 06:29:46.858256 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.858237 2570 scope.go:117] "RemoveContainer" containerID="41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6" Apr 21 06:29:46.858462 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:46.858443 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6\": container with ID starting with 41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6 not found: ID does not exist" containerID="41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6" Apr 21 06:29:46.858525 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.858469 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6"} err="failed to get container status \"41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6\": rpc error: code = NotFound desc = could not find container \"41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6\": container with ID starting with 41f0e82dcc2b1df520e6fe446908c94848800ace40d9efaca27426649ea311a6 not found: ID does not exist" Apr 21 06:29:46.858525 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.858487 2570 scope.go:117] "RemoveContainer" containerID="2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649" Apr 21 06:29:46.858712 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:46.858698 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649\": container with ID starting with 2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649 not found: ID does not exist" containerID="2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649" Apr 21 06:29:46.858758 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.858725 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649"} err="failed to get container status \"2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649\": rpc error: code = NotFound desc = could not find container \"2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649\": container with ID starting with 2f5ffb795fed5ab4c61a837bc7fdf0cdc12f8554660a812c16ceeb9fc0987649 not found: ID does not exist" Apr 21 06:29:46.862360 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.862342 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.865026 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865009 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 06:29:46.865193 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865064 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 06:29:46.865193 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865074 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 06:29:46.865193 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865112 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 06:29:46.865341 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865327 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 06:29:46.865395 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865327 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 06:29:46.865545 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865526 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 06:29:46.865618 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865561 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lv9bw\"" Apr 21 06:29:46.865901 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.865885 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 06:29:46.871026 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.871006 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 06:29:46.872805 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.872785 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:29:46.965266 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965227 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965266 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965271 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/48a4461d-42cd-4b0b-85a1-7553ac766967-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965468 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965306 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/48a4461d-42cd-4b0b-85a1-7553ac766967-tls-assets\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965468 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48a4461d-42cd-4b0b-85a1-7553ac766967-config-out\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965468 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965383 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965468 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965403 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48a4461d-42cd-4b0b-85a1-7553ac766967-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965468 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hqxl\" (UniqueName: \"kubernetes.io/projected/48a4461d-42cd-4b0b-85a1-7553ac766967-kube-api-access-9hqxl\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965621 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965621 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965621 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-config-volume\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965621 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965609 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/48a4461d-42cd-4b0b-85a1-7553ac766967-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965737 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965635 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-web-config\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:46.965737 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:46.965671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.066631 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48a4461d-42cd-4b0b-85a1-7553ac766967-tls-assets\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.066631 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48a4461d-42cd-4b0b-85a1-7553ac766967-config-out\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.066631 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066624 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.066832 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066644 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48a4461d-42cd-4b0b-85a1-7553ac766967-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.066832 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066766 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hqxl\" (UniqueName: \"kubernetes.io/projected/48a4461d-42cd-4b0b-85a1-7553ac766967-kube-api-access-9hqxl\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.066832 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066798 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.066832 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066829 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.067027 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066850 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-config-volume\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.067027 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48a4461d-42cd-4b0b-85a1-7553ac766967-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.067027 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066946 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-web-config\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.067027 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.066972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.067027 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.067021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.067251 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.067055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/48a4461d-42cd-4b0b-85a1-7553ac766967-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.067367 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.067344 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/48a4461d-42cd-4b0b-85a1-7553ac766967-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.068177 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.067796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/48a4461d-42cd-4b0b-85a1-7553ac766967-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.069726 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.069597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/48a4461d-42cd-4b0b-85a1-7553ac766967-config-out\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.069726 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.069710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/48a4461d-42cd-4b0b-85a1-7553ac766967-tls-assets\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.070173 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.069848 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.070173 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.070049 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.070290 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.070188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.070447 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.070324 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-web-config\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.070447 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.070341 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48a4461d-42cd-4b0b-85a1-7553ac766967-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.070567 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.070511 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.070678 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.070659 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.071721 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.071698 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/48a4461d-42cd-4b0b-85a1-7553ac766967-config-volume\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.074176 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.074159 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hqxl\" (UniqueName: \"kubernetes.io/projected/48a4461d-42cd-4b0b-85a1-7553ac766967-kube-api-access-9hqxl\") pod \"alertmanager-main-0\" (UID: \"48a4461d-42cd-4b0b-85a1-7553ac766967\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.082210 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.082191 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f06050-163a-4720-b0dd-b3a1b905e054" path="/var/lib/kubelet/pods/59f06050-163a-4720-b0dd-b3a1b905e054/volumes" Apr 21 06:29:47.170547 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.170510 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 06:29:47.301112 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.301084 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 06:29:47.303505 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:29:47.303466 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a4461d_42cd_4b0b_85a1_7553ac766967.slice/crio-401ec73a9e215f10235a406012057c3f90a81b903873707613d7fa657d154e44 WatchSource:0}: Error finding container 401ec73a9e215f10235a406012057c3f90a81b903873707613d7fa657d154e44: Status 404 returned error can't find the container with id 401ec73a9e215f10235a406012057c3f90a81b903873707613d7fa657d154e44 Apr 21 06:29:47.815342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.815309 2570 generic.go:358] "Generic (PLEG): container finished" podID="48a4461d-42cd-4b0b-85a1-7553ac766967" containerID="ac64a3ba80c51d0d36d059949120f7d6323a81922a8c7f3bc0139df610be445f" exitCode=0 Apr 21 06:29:47.815504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.815396 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerDied","Data":"ac64a3ba80c51d0d36d059949120f7d6323a81922a8c7f3bc0139df610be445f"} Apr 21 06:29:47.815504 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:47.815438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerStarted","Data":"401ec73a9e215f10235a406012057c3f90a81b903873707613d7fa657d154e44"} Apr 21 06:29:48.822960 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:48.822930 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerStarted","Data":"0f931dd9881cf97ff2752cc60efac2543da4fc52a753dbc0e6891007f8f2130b"} Apr 21 06:29:48.822960 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:48.822965 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerStarted","Data":"ec1f06deef015026e3269b4475d6e56e26c2f24473d5a50ea5cff4e62aa02963"} Apr 21 06:29:48.823337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:48.822975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerStarted","Data":"67175d2b52c5c3aae1ef1e9666f51d04951cc4632ac40e275adbbad166e2d514"} Apr 21 06:29:48.823337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:48.822983 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerStarted","Data":"179729b7d8f76e3772ea5765f0cc067dc92ea96a5e7e957049e0bca2adad9722"} Apr 21 06:29:48.823337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:48.822991 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerStarted","Data":"565158d25e0b35d3bf9eb065bfd40aedf3c30806653091769af49d35a42decac"} Apr 21 06:29:48.823337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:48.822999 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"48a4461d-42cd-4b0b-85a1-7553ac766967","Type":"ContainerStarted","Data":"92213918383fc34a7f9d7d15d47739228a70720fec023de4edfa55a34870cf4a"} Apr 21 06:29:48.849417 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:48.849376 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.849362282 podStartE2EDuration="2.849362282s" podCreationTimestamp="2026-04-21 06:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:29:48.847145703 +0000 UTC m=+226.353610649" watchObservedRunningTime="2026-04-21 06:29:48.849362282 +0000 UTC m=+226.355827194" Apr 21 06:29:49.172116 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.172082 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:29:49.172627 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.172548 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy" containerID="cri-o://67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" gracePeriod=600 Apr 21 06:29:49.172627 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.172580 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-web" containerID="cri-o://3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" gracePeriod=600 Apr 21 06:29:49.172816 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.172534 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="prometheus" containerID="cri-o://0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" gracePeriod=600 Apr 21 06:29:49.172816 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.172548 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="thanos-sidecar" containerID="cri-o://2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" gracePeriod=600 Apr 21 06:29:49.172816 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.172577 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" gracePeriod=600 Apr 21 06:29:49.172816 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.172580 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="config-reloader" containerID="cri-o://5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" gracePeriod=600 Apr 21 06:29:49.410482 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.410459 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.485689 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.485615 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-tls\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.485689 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.485652 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-kubelet-serving-ca-bundle\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.485689 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.485676 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-db\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.485942 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.485697 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-grpc-tls\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.485942 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.485729 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " 
Apr 21 06:29:49.485942 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.485763 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-thanos-prometheus-http-client-file\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.485942 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.485791 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486212 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-rulefiles-0\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486251 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486273 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-metrics-client-ca\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486302 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-kube-rbac-proxy\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486341 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-trusted-ca-bundle\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486382 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-metrics-client-certs\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486405 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config-out\") pod 
\"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486465 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486499 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-web-config\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486525 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8xs5\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-kube-api-access-s8xs5\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486562 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-tls-assets\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486587 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-serving-certs-ca-bundle\") pod \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\" (UID: \"6e934c6a-c383-4e26-9283-e31dd7b3c42c\") " Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.486829 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.487151 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.487169 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:29:49.487794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.487517 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:29:49.488794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.487686 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:29:49.488794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.488239 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:29:49.488794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.488323 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.488794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.488579 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.488794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.488702 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.488794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.488791 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config" (OuterVolumeSpecName: "config") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.489134 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.488806 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.490006 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.489968 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config-out" (OuterVolumeSpecName: "config-out") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 06:29:49.490179 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.490154 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.490537 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.490516 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.490537 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.490525 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-kube-api-access-s8xs5" (OuterVolumeSpecName: "kube-api-access-s8xs5") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "kube-api-access-s8xs5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:29:49.490795 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.490779 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:29:49.490883 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.490842 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.499727 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.499703 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-web-config" (OuterVolumeSpecName: "web-config") pod "6e934c6a-c383-4e26-9283-e31dd7b3c42c" (UID: "6e934c6a-c383-4e26-9283-e31dd7b3c42c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:29:49.587546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587516 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-metrics-client-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587540 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config-out\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587546 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587551 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587561 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-web-config\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587571 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8xs5\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-kube-api-access-s8xs5\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587579 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e934c6a-c383-4e26-9283-e31dd7b3c42c-tls-assets\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587588 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587599 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587607 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-db\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587616 2570 reconciler_common.go:299] "Volume detached for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-grpc-tls\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587626 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587635 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587643 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-config\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587653 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587662 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-configmap-metrics-client-ca\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587671 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e934c6a-c383-4e26-9283-e31dd7b3c42c-secret-kube-rbac-proxy\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.587750 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.587679 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e934c6a-c383-4e26-9283-e31dd7b3c42c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:29:49.828647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828561 2570 generic.go:358] "Generic (PLEG): container finished" podID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" exitCode=0 Apr 21 06:29:49.828647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828585 2570 generic.go:358] "Generic (PLEG): container finished" podID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" exitCode=0 Apr 21 06:29:49.828647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828592 2570 generic.go:358] "Generic (PLEG): container finished" podID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" exitCode=0 Apr 21 06:29:49.828647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828598 2570 generic.go:358] "Generic (PLEG): container finished" podID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" 
containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" exitCode=0 Apr 21 06:29:49.828647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828604 2570 generic.go:358] "Generic (PLEG): container finished" podID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" exitCode=0 Apr 21 06:29:49.828647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828609 2570 generic.go:358] "Generic (PLEG): container finished" podID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" exitCode=0 Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828649 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828692 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828664 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828720 2570 scope.go:117] "RemoveContainer" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828708 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828808 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828843 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} Apr 21 06:29:49.829255 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.828874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6e934c6a-c383-4e26-9283-e31dd7b3c42c","Type":"ContainerDied","Data":"8425844536f389c57e2fa166e63384c1bf12bed020d80aa4fa94f7e52bd11c26"} Apr 21 06:29:49.836488 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.836456 2570 scope.go:117] "RemoveContainer" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 
06:29:49.842929 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.842907 2570 scope.go:117] "RemoveContainer" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.848939 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.848920 2570 scope.go:117] "RemoveContainer" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.851110 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.851084 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:29:49.855456 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.855436 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:29:49.855917 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.855873 2570 scope.go:117] "RemoveContainer" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.863564 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.863550 2570 scope.go:117] "RemoveContainer" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.869994 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.869978 2570 scope.go:117] "RemoveContainer" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" Apr 21 06:29:49.876009 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.875989 2570 scope.go:117] "RemoveContainer" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.876238 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:49.876223 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": container with ID starting with 9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d not found: ID does not exist" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.876291 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.876246 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} err="failed to get container status \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": rpc error: code = NotFound desc = could not find container \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": container with ID starting with 9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d not found: ID does not exist" Apr 21 06:29:49.876291 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.876265 2570 scope.go:117] "RemoveContainer" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 06:29:49.876466 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:49.876449 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": container with ID starting with 67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c not found: ID does not exist" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 06:29:49.876520 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.876475 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} err="failed to get container status \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": rpc error: code = NotFound desc = could not find container \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": container with ID starting with 67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c not found: ID does not exist" Apr 21 06:29:49.876520 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.876500 2570 scope.go:117] "RemoveContainer" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.876740 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:49.876725 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": container with ID starting with 3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a not found: ID does not exist" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.876794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.876745 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} err="failed to get container status \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": rpc error: code = NotFound desc = could not find container \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": container with ID starting with 3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a not found: ID does not exist" Apr 21 06:29:49.876794 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.876764 2570 scope.go:117] "RemoveContainer" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.877117 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:49.877100 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": container with ID starting with 2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5 not found: ID does not exist" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.877182 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877122 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} err="failed to get container status \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": rpc error: code = NotFound desc = could not find container \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": container with ID starting with 2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5 not found: ID does not exist" Apr 21 06:29:49.877182 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877142 2570 scope.go:117] "RemoveContainer" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.877349 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:49.877334 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": container with ID starting with 
5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699 not found: ID does not exist" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.877408 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877354 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} err="failed to get container status \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": rpc error: code = NotFound desc = could not find container \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": container with ID starting with 5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699 not found: ID does not exist" Apr 21 06:29:49.877408 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877372 2570 scope.go:117] "RemoveContainer" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.877564 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:49.877549 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": container with ID starting with 0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb not found: ID does not exist" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.877629 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877568 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} err="failed to get container status \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": rpc error: code = NotFound desc = could not find container \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": container with ID starting with 0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb not found: ID does not exist" Apr 21 06:29:49.877629 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877586 2570 scope.go:117] "RemoveContainer" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" Apr 21 06:29:49.877818 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:29:49.877796 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": container with ID starting with 0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f not found: ID does not exist" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" Apr 21 06:29:49.877892 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877826 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f"} err="failed to get container status \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": rpc error: code = NotFound desc = could not find container \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": container with ID starting with 0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f not found: ID does not exist" Apr 21 06:29:49.877892 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.877846 2570 scope.go:117] "RemoveContainer" 
containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.878163 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878123 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} err="failed to get container status \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": rpc error: code = NotFound desc = could not find container \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": container with ID starting with 9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d not found: ID does not exist" Apr 21 06:29:49.878163 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878149 2570 scope.go:117] "RemoveContainer" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 06:29:49.878585 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878387 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} err="failed to get container status \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": rpc error: code = NotFound desc = could not find container \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": container with ID starting with 67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c not found: ID does not exist" Apr 21 06:29:49.878585 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878409 2570 scope.go:117] "RemoveContainer" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.878754 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878669 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} err="failed to get container status \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": rpc error: code = NotFound desc = could not find container \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": container with ID starting with 3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a not found: ID does not exist" Apr 21 06:29:49.878754 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878687 2570 scope.go:117] "RemoveContainer" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.878946 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878922 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} err="failed to get container status \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": rpc error: code = NotFound desc = could not find container \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": container with ID starting with 2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5 not found: ID does not exist" Apr 21 06:29:49.879012 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.878949 2570 scope.go:117] "RemoveContainer" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.879222 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879184 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} err="failed to get container status \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": rpc error: code = NotFound desc = could not find container \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": container with ID starting with 5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699 not found: ID does not exist" Apr 21 06:29:49.879277 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879224 2570 scope.go:117] "RemoveContainer" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.879471 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879448 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} err="failed to get container status \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": rpc error: code = NotFound desc = could not find container \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": container with ID starting with 0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb not found: ID does not exist" Apr 21 06:29:49.879532 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879472 2570 scope.go:117] "RemoveContainer" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" Apr 21 06:29:49.879658 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879641 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:29:49.879710 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879685 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f"} err="failed to get container status \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": rpc error: code = NotFound desc = could not find container \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": container with ID starting with 0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f not found: ID does not exist" Apr 21 06:29:49.879710 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879701 2570 scope.go:117] "RemoveContainer" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.879921 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879900 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} err="failed to get container status \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": rpc error: code = NotFound desc = could not find container \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": container with ID starting with 9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d not found: ID does not exist" Apr 21 06:29:49.879993 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.879923 2570 scope.go:117] "RemoveContainer" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880441 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-web" Apr 21 06:29:49.880799 ip-10-0-138-68 
kubenswrapper[2570]: I0421 06:29:49.880465 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-web" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880497 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="init-config-reloader" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880506 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="init-config-reloader" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880519 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="config-reloader" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880528 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="config-reloader" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880550 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880559 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880584 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="prometheus" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880592 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="prometheus" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880601 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-thanos" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880609 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-thanos" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880628 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="thanos-sidecar" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880636 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="thanos-sidecar" Apr 21 06:29:49.880799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880790 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="prometheus" Apr 21 06:29:49.881473 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880812 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-web" Apr 21 06:29:49.881473 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880822 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="config-reloader" Apr 21 06:29:49.881473 
ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880833 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy-thanos" Apr 21 06:29:49.881473 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880850 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="thanos-sidecar" Apr 21 06:29:49.881473 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.880922 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" containerName="kube-rbac-proxy" Apr 21 06:29:49.885209 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.885153 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} err="failed to get container status \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": rpc error: code = NotFound desc = could not find container \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": container with ID starting with 67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c not found: ID does not exist" Apr 21 06:29:49.885326 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.885312 2570 scope.go:117] "RemoveContainer" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.885771 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.885752 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} err="failed to get container status \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": rpc error: code = NotFound desc = could not find container \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": container with ID starting with 3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a not found: ID does not exist" Apr 21 06:29:49.885833 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.885772 2570 scope.go:117] "RemoveContainer" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.886022 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886004 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} err="failed to get container status \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": rpc error: code = NotFound desc = could not find container \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": container with ID starting with 2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5 not found: ID does not exist" Apr 21 06:29:49.886072 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886029 2570 scope.go:117] "RemoveContainer" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.886287 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886264 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} err="failed to get container status \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": rpc error: code = NotFound desc = could not find container \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": 
container with ID starting with 5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699 not found: ID does not exist" Apr 21 06:29:49.886366 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886288 2570 scope.go:117] "RemoveContainer" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.886533 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886510 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} err="failed to get container status \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": rpc error: code = NotFound desc = could not find container \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": container with ID starting with 0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb not found: ID does not exist" Apr 21 06:29:49.886593 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886537 2570 scope.go:117] "RemoveContainer" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" Apr 21 06:29:49.886766 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886749 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f"} err="failed to get container status \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": rpc error: code = NotFound desc = could not find container \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": container with ID starting with 0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f not found: ID does not exist" Apr 21 06:29:49.886819 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.886767 2570 scope.go:117] "RemoveContainer" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.887101 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.887065 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} err="failed to get container status \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": rpc error: code = NotFound desc = could not find container \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": container with ID starting with 9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d not found: ID does not exist" Apr 21 06:29:49.887170 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.887105 2570 scope.go:117] "RemoveContainer" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 06:29:49.887417 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.887388 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} err="failed to get container status \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": rpc error: code = NotFound desc = could not find container \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": container with ID starting with 67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c not found: ID does not exist" Apr 21 06:29:49.887498 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.887430 2570 scope.go:117] "RemoveContainer" 
containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.887747 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.887722 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} err="failed to get container status \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": rpc error: code = NotFound desc = could not find container \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": container with ID starting with 3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a not found: ID does not exist" Apr 21 06:29:49.887799 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.887755 2570 scope.go:117] "RemoveContainer" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.887847 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.887736 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.888276 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.888054 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} err="failed to get container status \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": rpc error: code = NotFound desc = could not find container \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": container with ID starting with 2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5 not found: ID does not exist" Apr 21 06:29:49.888276 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.888079 2570 scope.go:117] "RemoveContainer" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.888462 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.888443 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} err="failed to get container status \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": rpc error: code = NotFound desc = could not find container \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": container with ID starting with 5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699 not found: ID does not exist" Apr 21 06:29:49.888518 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.888465 2570 scope.go:117] "RemoveContainer" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.888724 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.888695 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} err="failed to get container status \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": rpc error: code = NotFound desc = could not find container \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": container with ID starting with 0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb not found: ID does not exist" Apr 21 06:29:49.888724 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.888724 2570 scope.go:117] "RemoveContainer" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" Apr 21 06:29:49.888951 ip-10-0-138-68 
kubenswrapper[2570]: I0421 06:29:49.888929 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f"} err="failed to get container status \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": rpc error: code = NotFound desc = could not find container \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": container with ID starting with 0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f not found: ID does not exist" Apr 21 06:29:49.889021 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.888956 2570 scope.go:117] "RemoveContainer" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.889206 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.889187 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} err="failed to get container status \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": rpc error: code = NotFound desc = could not find container \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": container with ID starting with 9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d not found: ID does not exist" Apr 21 06:29:49.889206 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.889205 2570 scope.go:117] "RemoveContainer" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 06:29:49.889516 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.889407 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} err="failed to get container status \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": rpc error: code = NotFound desc = could not find container \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": container with ID starting with 67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c not found: ID does not exist" Apr 21 06:29:49.889516 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.889435 2570 scope.go:117] "RemoveContainer" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.889789 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.889726 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} err="failed to get container status \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": rpc error: code = NotFound desc = could not find container \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": container with ID starting with 3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a not found: ID does not exist" Apr 21 06:29:49.889789 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.889751 2570 scope.go:117] "RemoveContainer" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.890046 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890020 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} err="failed to get container status \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": rpc 
error: code = NotFound desc = could not find container \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": container with ID starting with 2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5 not found: ID does not exist" Apr 21 06:29:49.890139 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890048 2570 scope.go:117] "RemoveContainer" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.890282 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890260 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} err="failed to get container status \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": rpc error: code = NotFound desc = could not find container \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": container with ID starting with 5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699 not found: ID does not exist" Apr 21 06:29:49.890343 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890284 2570 scope.go:117] "RemoveContainer" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.890472 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890454 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 06:29:49.890630 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890558 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 06:29:49.890630 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890592 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 06:29:49.890630 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890570 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} err="failed to get container status \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": rpc error: code = NotFound desc = could not find container \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": container with ID starting with 0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb not found: ID does not exist" Apr 21 06:29:49.890630 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890624 2570 scope.go:117] "RemoveContainer" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" Apr 21 06:29:49.890887 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890750 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-9c34ncig7c1ms\"" Apr 21 06:29:49.890887 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890848 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 06:29:49.890997 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890908 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 06:29:49.890997 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890904 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f"} err="failed to get container status \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": rpc error: code = NotFound desc = could not find container \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": container with ID starting with 0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f not found: ID does not exist" Apr 21 06:29:49.890997 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.890983 2570 scope.go:117] "RemoveContainer" containerID="9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d" Apr 21 06:29:49.891138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891006 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v59qm\"" Apr 21 06:29:49.891138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891057 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 06:29:49.891236 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891169 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 06:29:49.891236 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891223 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d"} err="failed to get container status \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": rpc error: code = NotFound desc = could not find container \"9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d\": container with ID starting with 9739cadd859ee6d41a40123758bea48cdcaa37c13690299f4c101c56a7ea3a4d not found: ID does not exist" Apr 21 06:29:49.891335 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891243 2570 scope.go:117] "RemoveContainer" containerID="67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c" Apr 21 06:29:49.891335 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891264 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 06:29:49.891335 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891308 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 06:29:49.891481 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891458 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c"} err="failed to get container status \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": rpc error: code = NotFound desc = could not find container \"67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c\": container with ID starting with 67a5e8ceb127db2a8097572e73ba16cfde5dcbb8427cfb7e83d7bdefe079ee3c not found: ID does not exist" Apr 21 06:29:49.891663 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891484 2570 scope.go:117] "RemoveContainer" containerID="3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a" Apr 21 06:29:49.891663 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891531 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 06:29:49.891663 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891625 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 06:29:49.891837 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891794 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a"} err="failed to get container status \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": rpc error: code = NotFound desc = could not find container \"3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a\": container with ID starting with 3b411d2dde2b9d21931d0b54fd2873006ed17bec52639b56905026475ccf034a not found: ID does not exist" Apr 21 06:29:49.891982 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.891841 2570 scope.go:117] "RemoveContainer" containerID="2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5" Apr 21 06:29:49.892211 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.892185 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5"} err="failed to get container status \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": rpc error: code = NotFound desc = could not find container \"2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5\": container with ID starting with 2118a05968b13094aee007f6ed48b6458ecd7a824067d7a6d8612011f717b8f5 not found: ID does not exist" Apr 21 06:29:49.892293 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.892213 2570 scope.go:117] "RemoveContainer" containerID="5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699" Apr 21 06:29:49.892509 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.892476 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699"} err="failed to get container status \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": rpc error: code = NotFound desc = could not find container \"5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699\": container with ID starting with 5437578728d91f8af79fae86c2556a28e4afe245685698ad3c1bb1642cc8b699 not found: ID does not exist" Apr 21 06:29:49.892509 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.892503 2570 scope.go:117] "RemoveContainer" containerID="0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb" Apr 21 06:29:49.892960 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.892761 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb"} err="failed to get container status \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": rpc error: code = NotFound desc = could not find container \"0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb\": container with ID starting with 0ebd41262e3819446c6494c035888f5525d9ef7e50840e8aada6ef4c9e4c15cb not found: ID does not exist" Apr 21 06:29:49.892960 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.892784 2570 scope.go:117] "RemoveContainer" containerID="0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f" 
Apr 21 06:29:49.893157 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.893115 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f"} err="failed to get container status \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": rpc error: code = NotFound desc = could not find container \"0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f\": container with ID starting with 0b8cfebd31b245dd940285e8c8676c58c9aef2c4934eb84054f68e3ef9cde30f not found: ID does not exist" Apr 21 06:29:49.893723 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.893563 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 06:29:49.896315 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.896297 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:29:49.897585 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.897569 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 06:29:49.989962 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.989933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.989962 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.989968 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.989992 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990086 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a301cbe9-f6d5-415b-9c49-98705de9960a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990105 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990120 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990166 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-web-config\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990190 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-config\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990207 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990255 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990270 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990284 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990342 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990336 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a301cbe9-f6d5-415b-9c49-98705de9960a-config-out\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990530 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990362 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990530 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990386 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:49.990530 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:49.990400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7rqm\" (UniqueName: \"kubernetes.io/projected/a301cbe9-f6d5-415b-9c49-98705de9960a-kube-api-access-m7rqm\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.090987 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.090909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a301cbe9-f6d5-415b-9c49-98705de9960a-config-out\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.090987 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.090943 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.090987 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.090965 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091187 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091080 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7rqm\" (UniqueName: \"kubernetes.io/projected/a301cbe9-f6d5-415b-9c49-98705de9960a-kube-api-access-m7rqm\") pod \"prometheus-k8s-0\" (UID: 
\"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091187 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091187 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091187 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091365 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091209 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091489 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091462 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a301cbe9-f6d5-415b-9c49-98705de9960a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091573 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091573 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091541 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091573 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-web-config\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091718 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-config\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091718 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091640 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091718 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091693 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091879 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091879 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.091879 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.092031 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091902 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.092031 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091902 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.092031 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.091942 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
06:29:50.092235 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.092131 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.092786 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.092757 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.094142 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.094107 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.094231 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.094182 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.094821 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.094442 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a301cbe9-f6d5-415b-9c49-98705de9960a-config-out\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.094821 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.094502 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-config\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.094821 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.094760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.094821 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.094801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.095568 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.095541 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.096048 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.096024 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a301cbe9-f6d5-415b-9c49-98705de9960a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.096202 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.096184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.096564 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.096543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.096704 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.096684 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a301cbe9-f6d5-415b-9c49-98705de9960a-web-config\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.096844 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.096829 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a301cbe9-f6d5-415b-9c49-98705de9960a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.098158 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.098142 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7rqm\" (UniqueName: \"kubernetes.io/projected/a301cbe9-f6d5-415b-9c49-98705de9960a-kube-api-access-m7rqm\") pod \"prometheus-k8s-0\" (UID: \"a301cbe9-f6d5-415b-9c49-98705de9960a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.199762 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.199732 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:29:50.321025 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.320945 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 06:29:50.323035 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:29:50.323003 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda301cbe9_f6d5_415b_9c49_98705de9960a.slice/crio-dbbbb1cdf12104ded8a5dc8079cd9ed20419ca448ce3d2ea2b577a1d2e9af547 WatchSource:0}: Error finding container dbbbb1cdf12104ded8a5dc8079cd9ed20419ca448ce3d2ea2b577a1d2e9af547: Status 404 returned error can't find the container with id dbbbb1cdf12104ded8a5dc8079cd9ed20419ca448ce3d2ea2b577a1d2e9af547 Apr 21 06:29:50.832707 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.832674 2570 generic.go:358] "Generic (PLEG): container finished" podID="a301cbe9-f6d5-415b-9c49-98705de9960a" containerID="ad7b318d4e512d11177a31218cc5e0d53551f829696585ac7e47db87fe14d17f" exitCode=0 Apr 21 06:29:50.833158 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.832765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerDied","Data":"ad7b318d4e512d11177a31218cc5e0d53551f829696585ac7e47db87fe14d17f"} Apr 21 06:29:50.833158 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:50.832805 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerStarted","Data":"dbbbb1cdf12104ded8a5dc8079cd9ed20419ca448ce3d2ea2b577a1d2e9af547"} Apr 21 06:29:51.084080 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.084017 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e934c6a-c383-4e26-9283-e31dd7b3c42c" path="/var/lib/kubelet/pods/6e934c6a-c383-4e26-9283-e31dd7b3c42c/volumes" Apr 21 06:29:51.843762 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.843725 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerStarted","Data":"859e1973eca628f2d3464fcde7ddb669b9de41f170b1ccf0e19382a3db66449c"} Apr 21 06:29:51.843762 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.843759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerStarted","Data":"5f094291b4c132fe12fa8e05ce50a0513854edf2ec545a2fff2b09028bcdeb7b"} Apr 21 06:29:51.843762 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.843770 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerStarted","Data":"25dfb75d2df4b763d532281e4f5c7ca03d6b5432c47d25dbeee5ca61795e824f"} Apr 21 06:29:51.844328 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.843779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerStarted","Data":"e590bd5df776a58db8aacaa9b7dd7c5256d46e92e15d5760f508321b85780387"} Apr 21 06:29:51.844328 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.843788 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerStarted","Data":"8f08de2939d4440289b564e06e3db3fd97affb8e8254925de74677684e761408"} Apr 21 06:29:51.844328 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.843795 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a301cbe9-f6d5-415b-9c49-98705de9960a","Type":"ContainerStarted","Data":"55e9b7828f1f95a4eae05c949e052b3c2d7709dfdfa99c1957d4740b7be622b7"} Apr 21 06:29:51.869761 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:51.869711 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.869695402 podStartE2EDuration="2.869695402s" podCreationTimestamp="2026-04-21 06:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:29:51.867767824 +0000 UTC m=+229.374232760" watchObservedRunningTime="2026-04-21 06:29:51.869695402 +0000 UTC m=+229.376160314" Apr 21 06:29:55.200368 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:29:55.200318 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:30:50.200876 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:30:50.200818 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:30:50.216412 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:30:50.216386 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:30:51.030578 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:30:51.030552 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 06:31:02.959578 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:31:02.959549 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:31:02.959578 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:31:02.959565 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:31:02.966222 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:31:02.966202 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 06:34:48.992888 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:48.992777 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l"] Apr 21 06:34:48.995911 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:48.995890 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:34:48.998142 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:48.998107 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"kube-root-ca.crt\"" Apr 21 06:34:48.998257 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:48.998142 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"openshift-service-ca.crt\"" Apr 21 06:34:48.998257 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:48.998232 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"default-dockercfg-nvv9m\"" Apr 21 06:34:49.003165 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.003040 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l"] Apr 21 06:34:49.074099 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.074065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sv9z\" (UniqueName: \"kubernetes.io/projected/4f54019a-ea97-4ce8-9666-c2aa5691bb08-kube-api-access-7sv9z\") pod \"progression-enabled-node-0-0-c8s7l\" (UID: \"4f54019a-ea97-4ce8-9666-c2aa5691bb08\") " pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:34:49.175247 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.175189 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sv9z\" (UniqueName: \"kubernetes.io/projected/4f54019a-ea97-4ce8-9666-c2aa5691bb08-kube-api-access-7sv9z\") pod \"progression-enabled-node-0-0-c8s7l\" (UID: \"4f54019a-ea97-4ce8-9666-c2aa5691bb08\") " pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:34:49.182217 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.182188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sv9z\" (UniqueName: \"kubernetes.io/projected/4f54019a-ea97-4ce8-9666-c2aa5691bb08-kube-api-access-7sv9z\") pod \"progression-enabled-node-0-0-c8s7l\" (UID: \"4f54019a-ea97-4ce8-9666-c2aa5691bb08\") " pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:34:49.307487 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.307399 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:34:49.427417 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.427390 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l"] Apr 21 06:34:49.436586 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.436559 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 06:34:49.674239 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:34:49.674199 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" event={"ID":"4f54019a-ea97-4ce8-9666-c2aa5691bb08","Type":"ContainerStarted","Data":"ef12a013b9110d70e4c2f78df9adda98a53b3bc4cc22ede3f032448c92715b75"} Apr 21 06:36:36.956035 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:36.956007 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:36:36.956589 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:36.956172 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log" Apr 21 06:36:38.019447 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:38.019405 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" event={"ID":"4f54019a-ea97-4ce8-9666-c2aa5691bb08","Type":"ContainerStarted","Data":"2b4ca921ba56f0ca81d7e6350abd3faf1846843cc4f83465f2c1a43ef09cc905"} Apr 21 06:36:38.019910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:38.019529 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:36:38.039380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:38.039328 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" podStartSLOduration=1.925209212 podStartE2EDuration="1m50.039313031s" podCreationTimestamp="2026-04-21 06:34:48 +0000 UTC" firstStartedPulling="2026-04-21 06:34:49.43672842 +0000 UTC m=+526.943193312" lastFinishedPulling="2026-04-21 06:36:37.55083224 +0000 UTC m=+635.057297131" observedRunningTime="2026-04-21 06:36:38.038387753 +0000 UTC m=+635.544852668" watchObservedRunningTime="2026-04-21 06:36:38.039313031 +0000 UTC m=+635.545777959" Apr 21 06:36:39.022104 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:39.021989 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" podUID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" containerName="node" probeResult="failure" output="Get \"http://10.132.0.18:28080/metrics\": dial tcp 10.132.0.18:28080: connect: connection refused" Apr 21 06:36:39.197188 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:39.023065 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" podUID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" containerName="node" probeResult="failure" output="Get \"http://10.132.0.18:28080/metrics\": dial tcp 10.132.0.18:28080: connect: connection refused" Apr 21 06:36:40.024864 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:36:40.024827 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:37:01.023910 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:01.023850 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" podUID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" containerName="node" probeResult="failure" output="Get \"http://10.132.0.18:28080/metrics\": dial tcp 10.132.0.18:28080: connect: connection refused" Apr 21 06:37:01.085152 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:01.085065 2570 generic.go:358] "Generic (PLEG): container finished" podID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" containerID="2b4ca921ba56f0ca81d7e6350abd3faf1846843cc4f83465f2c1a43ef09cc905" exitCode=0 Apr 21 06:37:01.085152 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:01.085135 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" event={"ID":"4f54019a-ea97-4ce8-9666-c2aa5691bb08","Type":"ContainerDied","Data":"2b4ca921ba56f0ca81d7e6350abd3faf1846843cc4f83465f2c1a43ef09cc905"} Apr 21 06:37:02.214018 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:02.213994 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:37:02.342129 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:02.342044 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sv9z\" (UniqueName: \"kubernetes.io/projected/4f54019a-ea97-4ce8-9666-c2aa5691bb08-kube-api-access-7sv9z\") pod \"4f54019a-ea97-4ce8-9666-c2aa5691bb08\" (UID: \"4f54019a-ea97-4ce8-9666-c2aa5691bb08\") " Apr 21 06:37:02.344187 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:02.344159 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f54019a-ea97-4ce8-9666-c2aa5691bb08-kube-api-access-7sv9z" (OuterVolumeSpecName: "kube-api-access-7sv9z") pod "4f54019a-ea97-4ce8-9666-c2aa5691bb08" (UID: "4f54019a-ea97-4ce8-9666-c2aa5691bb08"). InnerVolumeSpecName "kube-api-access-7sv9z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:37:02.442971 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:02.442937 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sv9z\" (UniqueName: \"kubernetes.io/projected/4f54019a-ea97-4ce8-9666-c2aa5691bb08-kube-api-access-7sv9z\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:37:03.090682 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:03.090660 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" Apr 21 06:37:03.090802 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:03.090658 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l" event={"ID":"4f54019a-ea97-4ce8-9666-c2aa5691bb08","Type":"ContainerDied","Data":"ef12a013b9110d70e4c2f78df9adda98a53b3bc4cc22ede3f032448c92715b75"} Apr 21 06:37:03.090802 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:03.090769 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef12a013b9110d70e4c2f78df9adda98a53b3bc4cc22ede3f032448c92715b75" Apr 21 06:37:05.395974 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.395941 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb"] Apr 21 06:37:05.396359 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.396217 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" containerName="node" Apr 21 06:37:05.396359 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.396227 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" containerName="node" Apr 21 06:37:05.396359 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.396284 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" containerName="node" Apr 21 06:37:05.417306 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.417267 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb"] Apr 21 06:37:05.417460 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.417402 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:05.419658 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.419632 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"kube-root-ca.crt\"" Apr 21 06:37:05.419658 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.419641 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"openshift-service-ca.crt\"" Apr 21 06:37:05.419658 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.419633 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"default-dockercfg-nvv9m\"" Apr 21 06:37:05.569419 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.569385 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-947tr\" (UniqueName: \"kubernetes.io/projected/0df03139-29a2-4048-b02c-c53bdabf8365-kube-api-access-947tr\") pod \"progression-disabled-node-0-0-9w5sb\" (UID: \"0df03139-29a2-4048-b02c-c53bdabf8365\") " pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:05.669961 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.669875 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-947tr\" (UniqueName: \"kubernetes.io/projected/0df03139-29a2-4048-b02c-c53bdabf8365-kube-api-access-947tr\") pod \"progression-disabled-node-0-0-9w5sb\" (UID: \"0df03139-29a2-4048-b02c-c53bdabf8365\") " pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:05.678210 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.678181 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-947tr\" (UniqueName: \"kubernetes.io/projected/0df03139-29a2-4048-b02c-c53bdabf8365-kube-api-access-947tr\") pod \"progression-disabled-node-0-0-9w5sb\" (UID: \"0df03139-29a2-4048-b02c-c53bdabf8365\") " pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:05.727138 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.727102 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:05.846694 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:05.846669 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb"] Apr 21 06:37:05.849396 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:37:05.849370 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df03139_29a2_4048_b02c_c53bdabf8365.slice/crio-3d0dd3bb4a185f1bbedd6ff0bcca29e3dcac7e905eeb0924f67069c8ab98aca8 WatchSource:0}: Error finding container 3d0dd3bb4a185f1bbedd6ff0bcca29e3dcac7e905eeb0924f67069c8ab98aca8: Status 404 returned error can't find the container with id 3d0dd3bb4a185f1bbedd6ff0bcca29e3dcac7e905eeb0924f67069c8ab98aca8 Apr 21 06:37:06.101845 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:06.101810 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" event={"ID":"0df03139-29a2-4048-b02c-c53bdabf8365","Type":"ContainerStarted","Data":"b7fa17f9de26009813a87e9e5775c24ed3a3cd5e5767271ad5a9d36bb47b0069"} Apr 21 06:37:06.101845 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:06.101846 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" event={"ID":"0df03139-29a2-4048-b02c-c53bdabf8365","Type":"ContainerStarted","Data":"3d0dd3bb4a185f1bbedd6ff0bcca29e3dcac7e905eeb0924f67069c8ab98aca8"} Apr 21 06:37:06.102074 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:06.101954 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:06.118061 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:06.117982 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" podStartSLOduration=1.117967593 podStartE2EDuration="1.117967593s" podCreationTimestamp="2026-04-21 06:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:37:06.116356057 +0000 UTC m=+663.622820969" watchObservedRunningTime="2026-04-21 06:37:06.117967593 +0000 UTC m=+663.624432505" Apr 21 06:37:07.104022 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:07.103985 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" podUID="0df03139-29a2-4048-b02c-c53bdabf8365" containerName="node" probeResult="failure" output="Get \"http://10.132.0.19:28080/metrics\": dial tcp 10.132.0.19:28080: connect: connection refused" Apr 21 06:37:07.107896 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:07.107840 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" podUID="0df03139-29a2-4048-b02c-c53bdabf8365" containerName="node" probeResult="failure" output="Get \"http://10.132.0.19:28080/metrics\": dial tcp 10.132.0.19:28080: connect: connection refused" Apr 21 06:37:08.109831 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:08.109801 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:29.108705 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:29.108664 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" podUID="0df03139-29a2-4048-b02c-c53bdabf8365" containerName="node" probeResult="failure" output="Get \"http://10.132.0.19:28080/metrics\": dial tcp 10.132.0.19:28080: connect: connection refused" Apr 21 06:37:29.171803 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:29.171754 2570 generic.go:358] "Generic (PLEG): container finished" podID="0df03139-29a2-4048-b02c-c53bdabf8365" containerID="b7fa17f9de26009813a87e9e5775c24ed3a3cd5e5767271ad5a9d36bb47b0069" exitCode=0 Apr 21 06:37:29.171975 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:29.171828 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" event={"ID":"0df03139-29a2-4048-b02c-c53bdabf8365","Type":"ContainerDied","Data":"b7fa17f9de26009813a87e9e5775c24ed3a3cd5e5767271ad5a9d36bb47b0069"} Apr 21 06:37:30.302239 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:30.302214 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" Apr 21 06:37:30.358306 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:30.358264 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-947tr\" (UniqueName: \"kubernetes.io/projected/0df03139-29a2-4048-b02c-c53bdabf8365-kube-api-access-947tr\") pod \"0df03139-29a2-4048-b02c-c53bdabf8365\" (UID: \"0df03139-29a2-4048-b02c-c53bdabf8365\") " Apr 21 06:37:30.360330 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:30.360305 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df03139-29a2-4048-b02c-c53bdabf8365-kube-api-access-947tr" (OuterVolumeSpecName: "kube-api-access-947tr") pod "0df03139-29a2-4048-b02c-c53bdabf8365" (UID: "0df03139-29a2-4048-b02c-c53bdabf8365"). InnerVolumeSpecName "kube-api-access-947tr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:37:30.459033 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:30.458941 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-947tr\" (UniqueName: \"kubernetes.io/projected/0df03139-29a2-4048-b02c-c53bdabf8365-kube-api-access-947tr\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 21 06:37:31.178399 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:31.178362 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb" event={"ID":"0df03139-29a2-4048-b02c-c53bdabf8365","Type":"ContainerDied","Data":"3d0dd3bb4a185f1bbedd6ff0bcca29e3dcac7e905eeb0924f67069c8ab98aca8"} Apr 21 06:37:31.178399 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:31.178397 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d0dd3bb4a185f1bbedd6ff0bcca29e3dcac7e905eeb0924f67069c8ab98aca8" Apr 21 06:37:31.178647 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:31.178409 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb"
Apr 21 06:37:40.397871 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.397828 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"]
Apr 21 06:37:40.398337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.398136 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0df03139-29a2-4048-b02c-c53bdabf8365" containerName="node"
Apr 21 06:37:40.398337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.398148 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df03139-29a2-4048-b02c-c53bdabf8365" containerName="node"
Apr 21 06:37:40.398337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.398195 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0df03139-29a2-4048-b02c-c53bdabf8365" containerName="node"
Apr 21 06:37:40.400140 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.400125 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:37:40.402187 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.402166 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"kube-root-ca.crt\""
Apr 21 06:37:40.402325 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.402222 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"default-dockercfg-nvv9m\""
Apr 21 06:37:40.402325 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.402235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"openshift-service-ca.crt\""
Apr 21 06:37:40.408612 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.408593 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"]
Apr 21 06:37:40.438147 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.438117 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqpq\" (UniqueName: \"kubernetes.io/projected/45ef1dc7-f6a7-41c2-ba73-1f0ee6404589-kube-api-access-9sqpq\") pod \"progression-invalid-node-0-0-qvjlg\" (UID: \"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589\") " pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:37:40.539449 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.539398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqpq\" (UniqueName: \"kubernetes.io/projected/45ef1dc7-f6a7-41c2-ba73-1f0ee6404589-kube-api-access-9sqpq\") pod \"progression-invalid-node-0-0-qvjlg\" (UID: \"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589\") " pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:37:40.547405 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.547372 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqpq\" (UniqueName: \"kubernetes.io/projected/45ef1dc7-f6a7-41c2-ba73-1f0ee6404589-kube-api-access-9sqpq\") pod \"progression-invalid-node-0-0-qvjlg\" (UID: \"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589\") " pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:37:40.710612 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.710515 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:37:40.827007 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:40.826976 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"]
Apr 21 06:37:40.830276 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:37:40.830249 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45ef1dc7_f6a7_41c2_ba73_1f0ee6404589.slice/crio-0c9e58748097914cd259f43329af919f37595e9169c923237b4729ecf14e0454 WatchSource:0}: Error finding container 0c9e58748097914cd259f43329af919f37595e9169c923237b4729ecf14e0454: Status 404 returned error can't find the container with id 0c9e58748097914cd259f43329af919f37595e9169c923237b4729ecf14e0454
Apr 21 06:37:41.210709 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:41.210671 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" event={"ID":"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589","Type":"ContainerStarted","Data":"f441c166e88b89b5ff358babf9357a41e462d7493afd959924c20715a04e763f"}
Apr 21 06:37:41.210905 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:41.210715 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" event={"ID":"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589","Type":"ContainerStarted","Data":"0c9e58748097914cd259f43329af919f37595e9169c923237b4729ecf14e0454"}
Apr 21 06:37:41.210905 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:41.210800 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:37:41.226547 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:41.226501 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" podStartSLOduration=1.226488133 podStartE2EDuration="1.226488133s" podCreationTimestamp="2026-04-21 06:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:37:41.225083565 +0000 UTC m=+698.731548479" watchObservedRunningTime="2026-04-21 06:37:41.226488133 +0000 UTC m=+698.732953046"
Apr 21 06:37:42.212734 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:42.212689 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" podUID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" containerName="node" probeResult="failure" output="Get \"http://10.132.0.20:28080/metrics\": dial tcp 10.132.0.20:28080: connect: connection refused"
Apr 21 06:37:42.214046 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:42.214014 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" podUID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" containerName="node" probeResult="failure" output="Get \"http://10.132.0.20:28080/metrics\": dial tcp 10.132.0.20:28080: connect: connection refused"
Apr 21 06:37:43.215383 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:37:43.215351 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:38:04.214381 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:04.214338 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" podUID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" containerName="node" probeResult="failure" output="Get \"http://10.132.0.20:28080/metrics\": dial tcp 10.132.0.20:28080: connect: connection refused"
Apr 21 06:38:04.272827 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:04.272738 2570 generic.go:358] "Generic (PLEG): container finished" podID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" containerID="f441c166e88b89b5ff358babf9357a41e462d7493afd959924c20715a04e763f" exitCode=0
Apr 21 06:38:04.272827 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:04.272813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" event={"ID":"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589","Type":"ContainerDied","Data":"f441c166e88b89b5ff358babf9357a41e462d7493afd959924c20715a04e763f"}
Apr 21 06:38:05.390666 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:05.390641 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:38:05.539079 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:05.538990 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sqpq\" (UniqueName: \"kubernetes.io/projected/45ef1dc7-f6a7-41c2-ba73-1f0ee6404589-kube-api-access-9sqpq\") pod \"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589\" (UID: \"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589\") "
Apr 21 06:38:05.541090 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:05.541056 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ef1dc7-f6a7-41c2-ba73-1f0ee6404589-kube-api-access-9sqpq" (OuterVolumeSpecName: "kube-api-access-9sqpq") pod "45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" (UID: "45ef1dc7-f6a7-41c2-ba73-1f0ee6404589"). InnerVolumeSpecName "kube-api-access-9sqpq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 06:38:05.639942 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:05.639910 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9sqpq\" (UniqueName: \"kubernetes.io/projected/45ef1dc7-f6a7-41c2-ba73-1f0ee6404589-kube-api-access-9sqpq\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 21 06:38:06.279610 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:06.279586 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"
Apr 21 06:38:06.279785 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:06.279583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg" event={"ID":"45ef1dc7-f6a7-41c2-ba73-1f0ee6404589","Type":"ContainerDied","Data":"0c9e58748097914cd259f43329af919f37595e9169c923237b4729ecf14e0454"}
Apr 21 06:38:06.279785 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:38:06.279699 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c9e58748097914cd259f43329af919f37595e9169c923237b4729ecf14e0454"
Apr 21 06:39:55.998519 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:55.998486 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"]
Apr 21 06:39:55.999004 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:55.998799 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" containerName="node"
Apr 21 06:39:55.999004 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:55.998811 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" containerName="node"
Apr 21 06:39:55.999004 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:55.998884 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" containerName="node"
Apr 21 06:39:56.000622 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.000607 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"
Apr 21 06:39:56.005171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.005135 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"openshift-service-ca.crt\""
Apr 21 06:39:56.005171 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.005158 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"kube-root-ca.crt\""
Apr 21 06:39:56.005362 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.005182 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"default-dockercfg-nvv9m\""
Apr 21 06:39:56.010000 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.009976 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"]
Apr 21 06:39:56.096237 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.096203 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngw9j\" (UniqueName: \"kubernetes.io/projected/f26624ac-f537-4018-b0f5-4f4c0d3c5513-kube-api-access-ngw9j\") pod \"progression-no-metrics-node-0-0-m628d\" (UID: \"f26624ac-f537-4018-b0f5-4f4c0d3c5513\") " pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"
Apr 21 06:39:56.197625 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.197591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngw9j\" (UniqueName: \"kubernetes.io/projected/f26624ac-f537-4018-b0f5-4f4c0d3c5513-kube-api-access-ngw9j\") pod \"progression-no-metrics-node-0-0-m628d\" (UID: \"f26624ac-f537-4018-b0f5-4f4c0d3c5513\") " pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"
Apr 21 06:39:56.204704 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.204679 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngw9j\" (UniqueName: \"kubernetes.io/projected/f26624ac-f537-4018-b0f5-4f4c0d3c5513-kube-api-access-ngw9j\") pod \"progression-no-metrics-node-0-0-m628d\" (UID: \"f26624ac-f537-4018-b0f5-4f4c0d3c5513\") " pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"
Apr 21 06:39:56.309790 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.309705 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"
Apr 21 06:39:56.426204 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.426182 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"]
Apr 21 06:39:56.428234 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:39:56.428207 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26624ac_f537_4018_b0f5_4f4c0d3c5513.slice/crio-3c97e276febdee8a941fe3a3e944d0ab9485df877468b62dcab4cf80c60902ea WatchSource:0}: Error finding container 3c97e276febdee8a941fe3a3e944d0ab9485df877468b62dcab4cf80c60902ea: Status 404 returned error can't find the container with id 3c97e276febdee8a941fe3a3e944d0ab9485df877468b62dcab4cf80c60902ea
Apr 21 06:39:56.430220 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.430201 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 06:39:56.608238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.608149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d" event={"ID":"f26624ac-f537-4018-b0f5-4f4c0d3c5513","Type":"ContainerStarted","Data":"6b2c92afce0935a02af08ba672f0407a06b16a69b4913925130e9497f73fead7"}
Apr 21 06:39:56.608238 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.608193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d" event={"ID":"f26624ac-f537-4018-b0f5-4f4c0d3c5513","Type":"ContainerStarted","Data":"3c97e276febdee8a941fe3a3e944d0ab9485df877468b62dcab4cf80c60902ea"}
Apr 21 06:39:56.623474 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:39:56.623433 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d" podStartSLOduration=1.6234206009999999 podStartE2EDuration="1.623420601s" podCreationTimestamp="2026-04-21 06:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:39:56.621413344 +0000 UTC m=+834.127878257" watchObservedRunningTime="2026-04-21 06:39:56.623420601 +0000 UTC m=+834.129885586"
Apr 21 06:40:01.633592 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:01.633559 2570 generic.go:358] "Generic (PLEG): container finished" podID="f26624ac-f537-4018-b0f5-4f4c0d3c5513" containerID="6b2c92afce0935a02af08ba672f0407a06b16a69b4913925130e9497f73fead7" exitCode=0
Apr 21 06:40:01.634069 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:01.633610 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d" event={"ID":"f26624ac-f537-4018-b0f5-4f4c0d3c5513","Type":"ContainerDied","Data":"6b2c92afce0935a02af08ba672f0407a06b16a69b4913925130e9497f73fead7"}
Apr 21 06:40:02.755462 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:02.755440 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"
Apr 21 06:40:02.852900 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:02.852843 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngw9j\" (UniqueName: \"kubernetes.io/projected/f26624ac-f537-4018-b0f5-4f4c0d3c5513-kube-api-access-ngw9j\") pod \"f26624ac-f537-4018-b0f5-4f4c0d3c5513\" (UID: \"f26624ac-f537-4018-b0f5-4f4c0d3c5513\") "
Apr 21 06:40:02.854813 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:02.854782 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26624ac-f537-4018-b0f5-4f4c0d3c5513-kube-api-access-ngw9j" (OuterVolumeSpecName: "kube-api-access-ngw9j") pod "f26624ac-f537-4018-b0f5-4f4c0d3c5513" (UID: "f26624ac-f537-4018-b0f5-4f4c0d3c5513"). InnerVolumeSpecName "kube-api-access-ngw9j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 06:40:02.954395 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:02.954311 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngw9j\" (UniqueName: \"kubernetes.io/projected/f26624ac-f537-4018-b0f5-4f4c0d3c5513-kube-api-access-ngw9j\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 21 06:40:03.641158 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:03.641123 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d" event={"ID":"f26624ac-f537-4018-b0f5-4f4c0d3c5513","Type":"ContainerDied","Data":"3c97e276febdee8a941fe3a3e944d0ab9485df877468b62dcab4cf80c60902ea"}
Apr 21 06:40:03.641158 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:03.641162 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c97e276febdee8a941fe3a3e944d0ab9485df877468b62dcab4cf80c60902ea"
Apr 21 06:40:03.641401 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:03.641134 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"
Apr 21 06:40:08.825392 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.825358 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-56jvx/must-gather-p6fpk"]
Apr 21 06:40:08.825759 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.825659 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f26624ac-f537-4018-b0f5-4f4c0d3c5513" containerName="node"
Apr 21 06:40:08.825759 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.825670 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26624ac-f537-4018-b0f5-4f4c0d3c5513" containerName="node"
Apr 21 06:40:08.825759 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.825716 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f26624ac-f537-4018-b0f5-4f4c0d3c5513" containerName="node"
Apr 21 06:40:08.827748 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.827732 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:08.829927 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.829895 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-56jvx\"/\"default-dockercfg-b2qbt\""
Apr 21 06:40:08.829927 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.829902 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-56jvx\"/\"openshift-service-ca.crt\""
Apr 21 06:40:08.830121 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.829952 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-56jvx\"/\"kube-root-ca.crt\""
Apr 21 06:40:08.838095 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.838072 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-56jvx/must-gather-p6fpk"]
Apr 21 06:40:08.902517 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.902484 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7382fdb-cb73-40ee-bc76-f2411bb94caf-must-gather-output\") pod \"must-gather-p6fpk\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") " pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:08.902674 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:08.902528 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589lc\" (UniqueName: \"kubernetes.io/projected/a7382fdb-cb73-40ee-bc76-f2411bb94caf-kube-api-access-589lc\") pod \"must-gather-p6fpk\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") " pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:09.003035 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:09.003005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7382fdb-cb73-40ee-bc76-f2411bb94caf-must-gather-output\") pod \"must-gather-p6fpk\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") " pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:09.003207 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:09.003045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-589lc\" (UniqueName: \"kubernetes.io/projected/a7382fdb-cb73-40ee-bc76-f2411bb94caf-kube-api-access-589lc\") pod \"must-gather-p6fpk\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") " pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:09.003340 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:09.003322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7382fdb-cb73-40ee-bc76-f2411bb94caf-must-gather-output\") pod \"must-gather-p6fpk\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") " pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:09.010198 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:09.010177 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-589lc\" (UniqueName: \"kubernetes.io/projected/a7382fdb-cb73-40ee-bc76-f2411bb94caf-kube-api-access-589lc\") pod \"must-gather-p6fpk\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") " pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:09.136742 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:09.136717 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:40:09.254164 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:09.254134 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-56jvx/must-gather-p6fpk"]
Apr 21 06:40:09.257190 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:40:09.257148 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7382fdb_cb73_40ee_bc76_f2411bb94caf.slice/crio-dea4f1def6c8b8ebe30d6fde12a1b40712f7dd08d9665071b66aa703f796a803 WatchSource:0}: Error finding container dea4f1def6c8b8ebe30d6fde12a1b40712f7dd08d9665071b66aa703f796a803: Status 404 returned error can't find the container with id dea4f1def6c8b8ebe30d6fde12a1b40712f7dd08d9665071b66aa703f796a803
Apr 21 06:40:09.660301 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:09.660268 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56jvx/must-gather-p6fpk" event={"ID":"a7382fdb-cb73-40ee-bc76-f2411bb94caf","Type":"ContainerStarted","Data":"dea4f1def6c8b8ebe30d6fde12a1b40712f7dd08d9665071b66aa703f796a803"}
Apr 21 06:40:12.973475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:12.973433 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb"]
Apr 21 06:40:12.974714 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:12.974688 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-disabled-node-0-0-9w5sb"]
Apr 21 06:40:12.979545 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:12.979477 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l"]
Apr 21 06:40:12.981324 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:12.981300 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-enabled-node-0-0-c8s7l"]
Apr 21 06:40:12.985637 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:12.985602 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"]
Apr 21 06:40:12.987057 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:12.987038 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-invalid-node-0-0-qvjlg"]
Apr 21 06:40:13.002313 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:13.002283 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"]
Apr 21 06:40:13.004028 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:13.004009 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-no-metrics-node-0-0-m628d"]
Apr 21 06:40:13.089450 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:13.089416 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df03139-29a2-4048-b02c-c53bdabf8365" path="/var/lib/kubelet/pods/0df03139-29a2-4048-b02c-c53bdabf8365/volumes"
Apr 21 06:40:13.089958 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:13.089929 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ef1dc7-f6a7-41c2-ba73-1f0ee6404589" path="/var/lib/kubelet/pods/45ef1dc7-f6a7-41c2-ba73-1f0ee6404589/volumes"
Apr 21 06:40:13.090358 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:13.090336 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f54019a-ea97-4ce8-9666-c2aa5691bb08" path="/var/lib/kubelet/pods/4f54019a-ea97-4ce8-9666-c2aa5691bb08/volumes"
Apr 21 06:40:13.090819 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:13.090794 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26624ac-f537-4018-b0f5-4f4c0d3c5513" path="/var/lib/kubelet/pods/f26624ac-f537-4018-b0f5-4f4c0d3c5513/volumes"
Apr 21 06:40:14.678080 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:14.678046 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56jvx/must-gather-p6fpk" event={"ID":"a7382fdb-cb73-40ee-bc76-f2411bb94caf","Type":"ContainerStarted","Data":"d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110"}
Apr 21 06:40:14.678456 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:14.678088 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56jvx/must-gather-p6fpk" event={"ID":"a7382fdb-cb73-40ee-bc76-f2411bb94caf","Type":"ContainerStarted","Data":"55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036"}
Apr 21 06:40:14.691983 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:14.691927 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-56jvx/must-gather-p6fpk" podStartSLOduration=1.580096401 podStartE2EDuration="6.691908337s" podCreationTimestamp="2026-04-21 06:40:08 +0000 UTC" firstStartedPulling="2026-04-21 06:40:09.258793075 +0000 UTC m=+846.765257970" lastFinishedPulling="2026-04-21 06:40:14.370605015 +0000 UTC m=+851.877069906" observedRunningTime="2026-04-21 06:40:14.691095049 +0000 UTC m=+852.197559963" watchObservedRunningTime="2026-04-21 06:40:14.691908337 +0000 UTC m=+852.198373250"
Apr 21 06:40:59.832350 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:59.832311 2570 generic.go:358] "Generic (PLEG): container finished" podID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerID="55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036" exitCode=0
Apr 21 06:40:59.832746 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:59.832364 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-56jvx/must-gather-p6fpk" event={"ID":"a7382fdb-cb73-40ee-bc76-f2411bb94caf","Type":"ContainerDied","Data":"55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036"}
Apr 21 06:40:59.832746 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:40:59.832710 2570 scope.go:117] "RemoveContainer" containerID="55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036"
Apr 21 06:41:00.414457 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:00.414426 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-56jvx_must-gather-p6fpk_a7382fdb-cb73-40ee-bc76-f2411bb94caf/gather/0.log"
Apr 21 06:41:03.500568 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:03.500534 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-frtp9_f9a14c21-e359-4c20-95a5-948922cc3ff8/global-pull-secret-syncer/0.log"
Apr 21 06:41:03.702380 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:03.702350 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gh2h2_dec8e610-c059-4e43-8e86-18a73c970319/konnectivity-agent/0.log"
Apr 21 06:41:03.805704 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:03.805621 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-68.ec2.internal_9475ce23d467a37e0480df7597bbc574/haproxy/0.log"
Apr 21 06:41:05.887965 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:05.887931 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-56jvx/must-gather-p6fpk"]
Apr 21 06:41:05.888763 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:05.888156 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-56jvx/must-gather-p6fpk" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerName="copy" containerID="cri-o://d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110" gracePeriod=2
Apr 21 06:41:05.890113 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:05.890069 2570 status_manager.go:895] "Failed to get status for pod" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" pod="openshift-must-gather-56jvx/must-gather-p6fpk" err="pods \"must-gather-p6fpk\" is forbidden: User \"system:node:ip-10-0-138-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-56jvx\": no relationship found between node 'ip-10-0-138-68.ec2.internal' and this object"
Apr 21 06:41:05.890257 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:05.890192 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-56jvx/must-gather-p6fpk"]
Apr 21 06:41:06.120374 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.120353 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-56jvx_must-gather-p6fpk_a7382fdb-cb73-40ee-bc76-f2411bb94caf/copy/0.log"
Apr 21 06:41:06.120702 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.120687 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:41:06.122602 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.122581 2570 status_manager.go:895] "Failed to get status for pod" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" pod="openshift-must-gather-56jvx/must-gather-p6fpk" err="pods \"must-gather-p6fpk\" is forbidden: User \"system:node:ip-10-0-138-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-56jvx\": no relationship found between node 'ip-10-0-138-68.ec2.internal' and this object"
Apr 21 06:41:06.189104 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.189030 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7382fdb-cb73-40ee-bc76-f2411bb94caf-must-gather-output\") pod \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") "
Apr 21 06:41:06.189104 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.189098 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-589lc\" (UniqueName: \"kubernetes.io/projected/a7382fdb-cb73-40ee-bc76-f2411bb94caf-kube-api-access-589lc\") pod \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\" (UID: \"a7382fdb-cb73-40ee-bc76-f2411bb94caf\") "
Apr 21 06:41:06.190554 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.190528 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7382fdb-cb73-40ee-bc76-f2411bb94caf-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a7382fdb-cb73-40ee-bc76-f2411bb94caf" (UID: "a7382fdb-cb73-40ee-bc76-f2411bb94caf"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 06:41:06.191305 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.191269 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7382fdb-cb73-40ee-bc76-f2411bb94caf-kube-api-access-589lc" (OuterVolumeSpecName: "kube-api-access-589lc") pod "a7382fdb-cb73-40ee-bc76-f2411bb94caf" (UID: "a7382fdb-cb73-40ee-bc76-f2411bb94caf"). InnerVolumeSpecName "kube-api-access-589lc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 06:41:06.289827 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.289788 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a7382fdb-cb73-40ee-bc76-f2411bb94caf-must-gather-output\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 21 06:41:06.289827 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.289818 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-589lc\" (UniqueName: \"kubernetes.io/projected/a7382fdb-cb73-40ee-bc76-f2411bb94caf-kube-api-access-589lc\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 21 06:41:06.626211 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.626184 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_48a4461d-42cd-4b0b-85a1-7553ac766967/alertmanager/0.log"
Apr 21 06:41:06.657965 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.657931 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_48a4461d-42cd-4b0b-85a1-7553ac766967/config-reloader/0.log"
Apr 21 06:41:06.680511 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.680488 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_48a4461d-42cd-4b0b-85a1-7553ac766967/kube-rbac-proxy-web/0.log"
Apr 21 06:41:06.709373 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.709344 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_48a4461d-42cd-4b0b-85a1-7553ac766967/kube-rbac-proxy/0.log"
Apr 21 06:41:06.741697 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.741665 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_48a4461d-42cd-4b0b-85a1-7553ac766967/kube-rbac-proxy-metric/0.log"
Apr 21 06:41:06.767865 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.767828 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_48a4461d-42cd-4b0b-85a1-7553ac766967/prom-label-proxy/0.log"
Apr 21 06:41:06.793725 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.793698 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_48a4461d-42cd-4b0b-85a1-7553ac766967/init-config-reloader/0.log"
Apr 21 06:41:06.853725 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.853700 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-56jvx_must-gather-p6fpk_a7382fdb-cb73-40ee-bc76-f2411bb94caf/copy/0.log"
Apr 21 06:41:06.854058 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.854035 2570 generic.go:358] "Generic (PLEG): container finished" podID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerID="d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110" exitCode=143
Apr 21 06:41:06.854153 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.854092 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-56jvx/must-gather-p6fpk"
Apr 21 06:41:06.854153 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.854105 2570 scope.go:117] "RemoveContainer" containerID="d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110"
Apr 21 06:41:06.856054 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.856028 2570 status_manager.go:895] "Failed to get status for pod" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" pod="openshift-must-gather-56jvx/must-gather-p6fpk" err="pods \"must-gather-p6fpk\" is forbidden: User \"system:node:ip-10-0-138-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-56jvx\": no relationship found between node 'ip-10-0-138-68.ec2.internal' and this object"
Apr 21 06:41:06.862818 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.862798 2570 scope.go:117] "RemoveContainer" containerID="55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036"
Apr 21 06:41:06.866942 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.866911 2570 status_manager.go:895] "Failed to get status for pod" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" pod="openshift-must-gather-56jvx/must-gather-p6fpk" err="pods \"must-gather-p6fpk\" is forbidden: User \"system:node:ip-10-0-138-68.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-56jvx\": no relationship found between node 'ip-10-0-138-68.ec2.internal' and this object"
Apr 21 06:41:06.875475 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.875459 2570 scope.go:117] "RemoveContainer" containerID="d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110"
Apr 21 06:41:06.875686 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:41:06.875666 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110\": container with ID starting with d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110 not found: ID does not exist" containerID="d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110"
Apr 21 06:41:06.875742 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.875700 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110"} err="failed to get container status \"d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110\": rpc error: code = NotFound desc = could not find container \"d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110\": container with ID starting with d2d83ac16c13239b0a4a726da7d3faa975957aeb48ad1f0ebb79c77d6eb1b110 not found: ID does not exist"
Apr 21 06:41:06.875742 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.875722 2570 scope.go:117] "RemoveContainer" containerID="55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036"
Apr 21 06:41:06.876030 ip-10-0-138-68 kubenswrapper[2570]: E0421 06:41:06.876012 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036\": container with ID starting with 55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036 not found: ID does not exist" containerID="55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036"
Apr 21 06:41:06.876095 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.876034 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036"} err="failed to get container status \"55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036\": rpc error: code = NotFound desc = could not find container \"55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036\": container with ID starting with 55095abcf8750b4c3a30b2cd13361755cfcda8743e4e8192b6008406258e6036 not found: ID does not exist"
Apr 21 06:41:06.958280 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:06.958213 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-mrrfx_4d3c5ba6-9729-40ec-8881-cff62bfb8bb3/monitoring-plugin/0.log"
Apr 21 06:41:07.083285 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.083255 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" path="/var/lib/kubelet/pods/a7382fdb-cb73-40ee-bc76-f2411bb94caf/volumes"
Apr 21 06:41:07.136371 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.136345 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qqfpq_0046ecc4-fbf2-441f-a0a9-c7b79d713ced/node-exporter/0.log"
Apr 21 06:41:07.163736 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.163708 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qqfpq_0046ecc4-fbf2-441f-a0a9-c7b79d713ced/kube-rbac-proxy/0.log"
Apr 21 06:41:07.187337 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.187311 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qqfpq_0046ecc4-fbf2-441f-a0a9-c7b79d713ced/init-textfile/0.log"
Apr 21 06:41:07.292557 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.292487 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a301cbe9-f6d5-415b-9c49-98705de9960a/prometheus/0.log"
Apr 21 06:41:07.313611 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.313586 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a301cbe9-f6d5-415b-9c49-98705de9960a/config-reloader/0.log"
Apr 21 06:41:07.335210 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.335183 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a301cbe9-f6d5-415b-9c49-98705de9960a/thanos-sidecar/0.log"
Apr 21 06:41:07.360051 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.360024 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a301cbe9-f6d5-415b-9c49-98705de9960a/kube-rbac-proxy-web/0.log"
Apr 21 06:41:07.381814 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.381786 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a301cbe9-f6d5-415b-9c49-98705de9960a/kube-rbac-proxy/0.log"
Apr 21 06:41:07.402737 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.402719 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a301cbe9-f6d5-415b-9c49-98705de9960a/kube-rbac-proxy-thanos/0.log"
Apr 21 06:41:07.430161 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.430133 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a301cbe9-f6d5-415b-9c49-98705de9960a/init-config-reloader/0.log"
Apr 21 06:41:07.508408 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.508374 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-zz8fg_908bb8a3-b3fd-4ad3-8194-5f7d51de620d/prometheus-operator-admission-webhook/0.log"
Apr 21 06:41:07.610353 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.610278 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-959989d7d-v9t94_3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b/thanos-query/0.log"
Apr 21 06:41:07.635355 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.635323 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-959989d7d-v9t94_3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b/kube-rbac-proxy-web/0.log"
Apr 21 06:41:07.655960 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.655938 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-959989d7d-v9t94_3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b/kube-rbac-proxy/0.log"
Apr 21 06:41:07.680030 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.680002 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-959989d7d-v9t94_3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b/prom-label-proxy/0.log"
Apr 21 06:41:07.701590 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.701564 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-959989d7d-v9t94_3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b/kube-rbac-proxy-rules/0.log"
Apr 21 06:41:07.722972 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:07.722947 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-959989d7d-v9t94_3c03c2ab-3a81-4b9f-9a90-e4ef5d38303b/kube-rbac-proxy-metrics/0.log"
Apr 21 06:41:10.566506 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.566474 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"]
Apr 21 06:41:10.566900 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.566775 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerName="copy"
Apr 21 06:41:10.566900 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.566785 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerName="copy"
Apr 21 06:41:10.566900 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.566799 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerName="gather"
Apr 21 06:41:10.566900 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.566805 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerName="gather"
Apr 21 06:41:10.566900 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.566874 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerName="copy"
Apr 21 06:41:10.566900 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.566884 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7382fdb-cb73-40ee-bc76-f2411bb94caf" containerName="gather"
Apr 21 06:41:10.572220 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.572198 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.576576 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.576550 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lwwlk\"/\"default-dockercfg-2tfc9\""
Apr 21 06:41:10.576707 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.576576 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lwwlk\"/\"kube-root-ca.crt\""
Apr 21 06:41:10.576707 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.576550 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lwwlk\"/\"openshift-service-ca.crt\""
Apr 21 06:41:10.579330 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.579130 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"]
Apr 21 06:41:10.623108 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.623070 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zslkj\" (UniqueName: \"kubernetes.io/projected/8923356e-911a-400b-896b-9c0b05aaccb1-kube-api-access-zslkj\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.623282 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.623130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-proc\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.623282 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.623207 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-podres\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.623282 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.623252 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-sys\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.623282 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.623279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-lib-modules\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.713540 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.713510 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xc224_e3794d28-61ca-4d8d-9d47-c634fc191844/dns/0.log"
Apr 21 06:41:10.724541 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zslkj\" (UniqueName: \"kubernetes.io/projected/8923356e-911a-400b-896b-9c0b05aaccb1-kube-api-access-zslkj\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724678 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-proc\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724678 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724610 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-podres\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724678 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724641 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-sys\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724791 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724702 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-sys\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724791 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724700 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-lib-modules\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724791 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724704 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-proc\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724791 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-podres\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.724952 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.724805 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8923356e-911a-400b-896b-9c0b05aaccb1-lib-modules\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.732962 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.732942 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zslkj\" (UniqueName: \"kubernetes.io/projected/8923356e-911a-400b-896b-9c0b05aaccb1-kube-api-access-zslkj\") pod \"perf-node-gather-daemonset-2w875\" (UID: \"8923356e-911a-400b-896b-9c0b05aaccb1\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.735369 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.735351 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xc224_e3794d28-61ca-4d8d-9d47-c634fc191844/kube-rbac-proxy/0.log"
Apr 21 06:41:10.779686 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.779658 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8z4fb_ea78721d-4fb0-4884-9dfd-d0be9bbc750b/dns-node-resolver/0.log"
Apr 21 06:41:10.883234 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.883194 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:10.999732 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:10.999709 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"]
Apr 21 06:41:11.002200 ip-10-0-138-68 kubenswrapper[2570]: W0421 06:41:11.002175 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8923356e_911a_400b_896b_9c0b05aaccb1.slice/crio-9344bec911ad299a0f4d34e9bcd587743cb3b246b430859808de041186f014e0 WatchSource:0}: Error finding container 9344bec911ad299a0f4d34e9bcd587743cb3b246b430859808de041186f014e0: Status 404 returned error can't find the container with id 9344bec911ad299a0f4d34e9bcd587743cb3b246b430859808de041186f014e0
Apr 21 06:41:11.247997 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:11.247926 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z2xrz_ada9d0db-7f80-4159-9e92-7fe71d0647f6/node-ca/0.log"
Apr 21 06:41:11.870619 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:11.870581 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875" event={"ID":"8923356e-911a-400b-896b-9c0b05aaccb1","Type":"ContainerStarted","Data":"bcdd001c1299ea25309d22b7e2ca32726db12f7b5ce71dce4b42789701eafc6f"}
Apr 21 06:41:11.870619 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:11.870616 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875" event={"ID":"8923356e-911a-400b-896b-9c0b05aaccb1","Type":"ContainerStarted","Data":"9344bec911ad299a0f4d34e9bcd587743cb3b246b430859808de041186f014e0"}
Apr 21 06:41:11.871100 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:11.870678 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:11.885107 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:11.885061 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875" podStartSLOduration=1.885045418 podStartE2EDuration="1.885045418s" podCreationTimestamp="2026-04-21 06:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:41:11.884008653 +0000 UTC m=+909.390473566" watchObservedRunningTime="2026-04-21 06:41:11.885045418 +0000 UTC m=+909.391510330"
Apr 21 06:41:12.188377 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:12.188307 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2pzsh_5a6936ad-93fd-4f26-83cc-7a94f1ebcac9/serve-healthcheck-canary/0.log"
Apr 21 06:41:12.662657 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:12.662628 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-85mq4_c570aac3-9017-4847-9f0d-2051fa8a7f0f/kube-rbac-proxy/0.log"
Apr 21 06:41:12.681308 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:12.681283 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-85mq4_c570aac3-9017-4847-9f0d-2051fa8a7f0f/exporter/0.log"
Apr 21 06:41:12.700446 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:12.700420 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-85mq4_c570aac3-9017-4847-9f0d-2051fa8a7f0f/extractor/0.log"
Apr 21 06:41:17.883123 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:17.883094 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-2w875"
Apr 21 06:41:18.263658 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.263579 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgs6_aa39b975-a320-4be6-9871-173b44b3bf1a/kube-multus-additional-cni-plugins/0.log"
Apr 21 06:41:18.283842 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.283818 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgs6_aa39b975-a320-4be6-9871-173b44b3bf1a/egress-router-binary-copy/0.log"
Apr 21 06:41:18.303220 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.303196 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgs6_aa39b975-a320-4be6-9871-173b44b3bf1a/cni-plugins/0.log"
Apr 21 06:41:18.321492 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.321466 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgs6_aa39b975-a320-4be6-9871-173b44b3bf1a/bond-cni-plugin/0.log"
Apr 21 06:41:18.362963 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.362933 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgs6_aa39b975-a320-4be6-9871-173b44b3bf1a/routeoverride-cni/0.log"
Apr 21 06:41:18.385353 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.385327 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgs6_aa39b975-a320-4be6-9871-173b44b3bf1a/whereabouts-cni-bincopy/0.log"
Apr 21 06:41:18.405214 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.405186 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9kgs6_aa39b975-a320-4be6-9871-173b44b3bf1a/whereabouts-cni/0.log"
Apr 21 06:41:18.566104 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.566026 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bngnm_74dcd627-03e5-412a-b898-6f771a157832/kube-multus/0.log"
Apr 21 06:41:18.729282 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.729256 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xhdsz_d0022157-8720-4a4c-8cf0-324fe8cb0e3f/network-metrics-daemon/0.log"
Apr 21 06:41:18.748635 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:18.748603 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xhdsz_d0022157-8720-4a4c-8cf0-324fe8cb0e3f/kube-rbac-proxy/0.log"
Apr 21 06:41:19.572329 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.572250 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-controller/0.log"
Apr 21 06:41:19.596213 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.596189 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/0.log"
Apr 21 06:41:19.600818 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.600777 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovn-acl-logging/1.log"
Apr 21 06:41:19.621838 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.621811 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/kube-rbac-proxy-node/0.log"
Apr 21 06:41:19.647413 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.647384 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 06:41:19.669580 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.669551 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/northd/0.log"
Apr 21 06:41:19.690224 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.690191 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/nbdb/0.log"
Apr 21 06:41:19.709701 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.709671 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/sbdb/0.log"
Apr 21 06:41:19.803283 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:19.803248 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78qjr_4860a8de-8ebf-4c37-b025-9aaf165b999b/ovnkube-controller/0.log"
Apr 21 06:41:21.366512 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:21.366474 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-pfzbp_aba5693c-c88c-45ce-9751-0d5e014097eb/network-check-target-container/0.log"
Apr 21 06:41:22.280437 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:22.280412 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kwxms_a1478fba-f9dc-413e-8354-ffdd0bcdaed2/iptables-alerter/0.log"
Apr 21 06:41:22.833919 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:22.833894 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7fmjt_bfc3125e-919d-4ff6-add5-623ba583cd1a/tuned/0.log"
Apr 21 06:41:26.062223 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:26.062195 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-w9lpz_0814d57e-a465-4787-8668-7b52f9ae671d/csi-driver/0.log"
Apr 21 06:41:26.081434 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:26.081406 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-w9lpz_0814d57e-a465-4787-8668-7b52f9ae671d/csi-node-driver-registrar/0.log"
Apr 21 06:41:26.101573 ip-10-0-138-68 kubenswrapper[2570]: I0421 06:41:26.101546 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-w9lpz_0814d57e-a465-4787-8668-7b52f9ae671d/csi-liveness-probe/0.log"