Apr 24 19:06:41.266588 ip-10-0-137-23 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
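Editor's note: as a minimal sketch of working with these deprecation warnings, the snippet below pulls the affected flag names out of journal lines like the ones above (the `log` string is an excerpted, abbreviated sample, not the full dump):

```python
import re

# Excerpted deprecation-warning lines from the journal above (abbreviated).
log = """\
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag.
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead.
Apr 24 19:06:41.745239 ip-10-0-137-23 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag.
"""

# Each warning names the deprecated flag right after "Flag ".
deprecated = re.findall(r"Flag (--[\w-]+) has been deprecated", log)
print(deprecated)
# → ['--container-runtime-endpoint', '--minimum-container-ttl-duration', '--system-reserved']
```

Flags reported this way are the ones the warnings say should move into the file given by `--config` (here `/etc/kubernetes/kubelet.conf`, per the FLAG dump further down).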
Apr 24 19:06:41.746995 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.746897 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 19:06:41.749360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749342 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:41.749360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749360 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749364 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749367 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749370 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749373 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749376 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749379 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749382 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749384 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749387 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749390 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749392 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749395 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749397 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749402 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749404 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749407 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749410 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749413 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749415 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:41.749427 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749418 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749420 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749422 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749425 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749428 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749431 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749435 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749439 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749443 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749446 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749449 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749452 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749454 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749457 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749459 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749462 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749464 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749467 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749472 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:41.749910 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749474 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749477 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749479 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749482 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749484 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749488 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749491 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749494 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749497 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749499 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749502 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749504 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749507 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749510 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749512 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749516 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749518 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749521 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749524 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749526 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:41.750396 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749529 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749532 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749534 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749537 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749539 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749542 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749544 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749547 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749550 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749552 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749555 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749557 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749560 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749563 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749566 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749568 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749570 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749573 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749576 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749579 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:41.750872 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749582 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749585 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749589 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749592 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749595 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.749600 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750013 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750021 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750024 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750028 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750031 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750033 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750036 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750039 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750042 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750044 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750047 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750050 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750052 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:41.751377 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750054 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750057 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750060 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750062 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750065 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750067 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750070 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750072 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750076 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750079 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750082 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750085 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750088 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750091 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750094 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750096 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750099 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750101 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750104 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:41.751860 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750107 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750109 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750112 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750114 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750117 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750120 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750122 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750125 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750142 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750145 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750148 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750151 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750154 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750157 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750161 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750163 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750166 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750168 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750170 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750173 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:41.752391 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750176 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750178 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750181 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750183 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750186 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750189 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750191 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750194 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750197 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750199 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750202 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750204 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750207 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750209 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750213 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
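Editor's note: the `unrecognized feature gate` warnings above are emitted in two passes, each listing the same gate names in a different order. A minimal sketch of checking that, using a small excerpted subset of the warning lines (not the full dump):

```python
import re
from collections import Counter

# Two gates, each excerpted once from the first pass and once from the second.
log = """\
W0424 19:06:41.749342 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
W0424 19:06:41.749544 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
W0424 19:06:41.750021 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
W0424 19:06:41.750239 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
"""

# Count how often each gate name is warned about.
gates = Counter(re.findall(r"unrecognized feature gate: (\w+)", log))
print(sorted(gates))        # → ['GatewayAPI', 'NutanixMultiSubnets']
print(set(gates.values()))  # → {2}  (one warning per parsing pass)
```

Run over the full journal, the same counter would show whether every gate really appears exactly once per pass; the warnings are noise from OpenShift-specific gate names that the upstream kubelet does not know, not per-gate failures.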
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750217 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750219 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750222 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750225 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:41.752880 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750227 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750230 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750234 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750236 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750239 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750242 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750244 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750261 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750264 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750267 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750269 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750272 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750274 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750277 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750279 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750350 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750362 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750369 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750374 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750379 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750382 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 19:06:41.753379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750387 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750391 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750395 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750398 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750402 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750405 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750409 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750412 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750415 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750418 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750421 2583 flags.go:64] FLAG: --cloud-config=""
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750423 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750426 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750430 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750433 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750437 2583 flags.go:64] FLAG: --config-dir=""
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750440 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750443 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750447 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750450 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750452 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750456 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750459 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750462 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 19:06:41.753930 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750464 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750467 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750471 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750475 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750479 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750482 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750484 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750488 2583 flags.go:64] FLAG: --enable-server="true"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750491 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750497 2583 flags.go:64] FLAG: --event-burst="100"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750500 2583 flags.go:64] FLAG: --event-qps="50"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750503 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750506 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750509 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750513 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750516 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750519 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750522 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750525 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750528 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750531 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750534 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750537 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750540 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750542 2583 flags.go:64] FLAG: --feature-gates=""
Apr 24 19:06:41.754524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750547 2583 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750551 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750554 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750558 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750561 2583 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750564 2583 flags.go:64] FLAG: --help="false"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750567 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-137-23.ec2.internal"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750570 2583 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750573 2583 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750576 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr
24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750579 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750583 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750586 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750588 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750591 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750595 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750598 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750602 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750605 2583 flags.go:64] FLAG: --kube-reserved="" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750607 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750610 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750614 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750617 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 19:06:41.755138 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750620 2583 flags.go:64] FLAG: --lock-file="" Apr 24 19:06:41.755138 
ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750623 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750626 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750629 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750634 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750637 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750640 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750643 2583 flags.go:64] FLAG: --logging-format="text" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750645 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750649 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750652 2583 flags.go:64] FLAG: --manifest-url="" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750655 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750660 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750663 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750667 2583 flags.go:64] FLAG: --max-pods="110" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750670 2583 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750673 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750676 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750679 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750682 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750685 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750688 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750696 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750699 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750702 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 19:06:41.755755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750705 2583 flags.go:64] FLAG: --pod-cidr="" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750708 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750715 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750717 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: 
I0424 19:06:41.750721 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750723 2583 flags.go:64] FLAG: --port="10250" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750726 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750729 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04f36224783753368" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750732 2583 flags.go:64] FLAG: --qos-reserved="" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750735 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750738 2583 flags.go:64] FLAG: --register-node="true" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750741 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750744 2583 flags.go:64] FLAG: --register-with-taints="" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750748 2583 flags.go:64] FLAG: --registry-burst="10" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750750 2583 flags.go:64] FLAG: --registry-qps="5" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750753 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750756 2583 flags.go:64] FLAG: --reserved-memory="" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750760 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750763 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750766 2583 
flags.go:64] FLAG: --rotate-certificates="false" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750768 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750771 2583 flags.go:64] FLAG: --runonce="false" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750774 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750777 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750780 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 24 19:06:41.756350 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750783 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750786 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750789 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750792 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750795 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750798 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750801 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750803 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750807 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 
19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750810 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750813 2583 flags.go:64] FLAG: --system-cgroups="" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750816 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750821 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750824 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750826 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750830 2583 flags.go:64] FLAG: --tls-min-version="" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750833 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750836 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750838 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750842 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750845 2583 flags.go:64] FLAG: --v="2" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750849 2583 flags.go:64] FLAG: --version="false" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750853 2583 flags.go:64] FLAG: --vmodule="" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750858 2583 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.750861 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 19:06:41.756948 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750955 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750959 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750962 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750964 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750967 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750969 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750973 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750975 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750979 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750982 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750985 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750988 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750991 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750995 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.750998 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751000 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751003 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751006 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751009 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751011 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:06:41.757567 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751014 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751016 2583 feature_gate.go:328] unrecognized 
feature gate: InsightsConfig Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751018 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751021 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751023 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751026 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751028 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751031 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751034 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751036 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751039 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751041 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751045 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751047 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 19:06:41.758056 ip-10-0-137-23 
kubenswrapper[2583]: W0424 19:06:41.751050 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751052 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751055 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751057 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751059 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:06:41.758056 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751062 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751064 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751067 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751069 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751072 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751074 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751078 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751081 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 
24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751083 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751086 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751089 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751091 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751094 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751097 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751099 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751101 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751104 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751106 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751109 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751112 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:06:41.758580 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751114 2583 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751117 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751119 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751121 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751124 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751128 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751130 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751133 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751135 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751137 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751141 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751143 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751146 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751148 2583 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751151 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751153 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751156 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751158 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751162 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751165 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:06:41.759198 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751167 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:41.759869 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751170 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:06:41.759869 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751172 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:06:41.759869 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751177 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 19:06:41.759869 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751180 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:41.759869 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751183 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:41.759869 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.751186 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:41.759869 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.751902 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:41.761080 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.761055 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 19:06:41.761125 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.761083 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 19:06:41.761153 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761137 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:41.761153 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761142 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:41.761153 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761145 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:41.761153 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761149 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:41.761153 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761152 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:41.761153 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761155 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761158 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761161 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761164 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761167 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761170 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761172 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761175 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761177 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761180 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761183 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761185 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761188 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761191 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761194 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761198 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761203 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761206 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761210 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:41.761328 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761212 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761215 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761217 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761220 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761223 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761225 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761228 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761230 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761233 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761236 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761238 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761241 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761243 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761246 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761262 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761265 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761268 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761271 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761273 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761276 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:41.761790 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761279 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761282 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761285 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761288 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761290 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761293 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761296 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761299 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761301 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761304 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761307 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761310 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761313 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761315 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761318 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761320 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761323 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761325 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761328 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761330 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:41.762334 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761333 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761336 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761338 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761341 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761343 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761346 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761348 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761352 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761355 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761357 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761360 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761363 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761366 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761369 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761371 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761374 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761377 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761380 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761382 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761385 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:41.762831 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761387 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761390 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.761395 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761495 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761500 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761503 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761506 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761510 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761513 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761515 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761518 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761521 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761524 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761526 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761529 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:41.763360 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761532 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761535 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761537 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761540 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761543 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761546 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761548 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761551 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761554 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761557 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761562 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761565 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761568 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761571 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761574 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761576 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761579 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761581 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761584 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761586 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:41.763732 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761589 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761591 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761594 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761596 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761599 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761602 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761605 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761608 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761610 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761613 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761615 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761617 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761620 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761622 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761624 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761627 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761630 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761633 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761635 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:41.764209 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761638 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761641 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761644 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761646 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761649 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761652 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761655 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761657 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761659 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761662 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761664 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761667 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761669 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761672 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761674 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761676 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761679 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761681 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761684 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761686 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:41.764702 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761689 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761691 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761693 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761696 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761698 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761700 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761703 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761705 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761708 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761710 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761713 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761716 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761718 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761721 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:41.761723 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:41.765356 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.761728 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:41.765740 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.762420 2583 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 19:06:41.765908 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.765892 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 19:06:41.766953 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.766941 2583 server.go:1019] "Starting client certificate rotation"
Apr 24 19:06:41.767057 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.767039 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:41.767093 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.767077 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:41.794756 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.794730 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:41.799166 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.799144 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:41.816281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.816235 2583 log.go:25] "Validated CRI v1 runtime API"
Apr 24 19:06:41.822595 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.822572 2583 log.go:25] "Validated CRI v1 image API"
Apr 24 19:06:41.826199 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.826160 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:06:41.827201 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.827175 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 19:06:41.830635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.830610 2583 fs.go:135] Filesystem UUIDs: map[613184bd-1060-4fbc-b1bc-43021bbe8c88:/dev/nvme0n1p4 69368f5e-cbb7-4947-b9ea-b96ec2d78c41:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 24 19:06:41.830716 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.830634 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 19:06:41.836526 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.836415 2583 manager.go:217] Machine: {Timestamp:2026-04-24 19:06:41.834390288 +0000 UTC m=+0.439681695 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098684 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2dc29a978293d7e8ad5d2551720a7a SystemUUID:ec2dc29a-9782-93d7-e8ad-5d2551720a7a BootID:dd19bfc0-5d77-42ea-a4d8-af0c370127db Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e4:79:3f:ab:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e4:79:3f:ab:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:e7:bd:a8:80:04 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 19:06:41.836526 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.836516 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 19:06:41.836646 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.836602 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 19:06:41.838003 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.837979 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 19:06:41.838143 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.838006 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-23.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 19:06:41.838195 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.838152 2583 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 19:06:41.838195 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.838162 2583 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 19:06:41.838195 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.838175 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 19:06:41.838195 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.838187 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 19:06:41.839075 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.839064 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 19:06:41.839181 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.839172 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 19:06:41.842459 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.842445 2583 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 19:06:41.842515 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.842464 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 19:06:41.842515 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.842481 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 19:06:41.842515 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.842491 2583 kubelet.go:397] "Adding apiserver pod source"
Apr 24 19:06:41.842515 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.842499 2583 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 19:06:41.843820 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.843807 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 19:06:41.843887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.843827 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 19:06:41.846121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.846101 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wghvh"
Apr 24 19:06:41.847197 ip-10-0-137-23
kubenswrapper[2583]: I0424 19:06:41.847178 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 19:06:41.849130 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.849117 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 19:06:41.850539 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850525 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 19:06:41.850591 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850550 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 19:06:41.850591 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850562 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 19:06:41.850591 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850571 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 19:06:41.850591 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850577 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 19:06:41.850591 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850583 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 19:06:41.850591 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850588 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 19:06:41.850757 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850596 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 19:06:41.850757 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850604 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 19:06:41.850757 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850610 2583 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 24 19:06:41.850757 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850623 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 19:06:41.850757 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.850636 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 19:06:41.852543 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.852531 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 19:06:41.852582 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.852547 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 19:06:41.854806 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.854785 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wghvh" Apr 24 19:06:41.855298 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.855271 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 19:06:41.855298 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.855288 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-23.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 19:06:41.855433 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.855382 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-23.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Apr 24 19:06:41.856465 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.856452 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 19:06:41.856511 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.856494 2583 server.go:1295] "Started kubelet" Apr 24 19:06:41.856602 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.856577 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 19:06:41.856719 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.856682 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 19:06:41.856778 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.856739 2583 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 19:06:41.857510 ip-10-0-137-23 systemd[1]: Started Kubernetes Kubelet. Apr 24 19:06:41.857878 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.857860 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 19:06:41.859504 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.859488 2583 server.go:317] "Adding debug handlers to kubelet server" Apr 24 19:06:41.864895 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.864871 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:41.865563 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.865546 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 19:06:41.866232 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866214 2583 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 19:06:41.866232 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866230 2583 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 19:06:41.866414 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866316 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 
19:06:41.866414 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866356 2583 reconstruct.go:97] "Volume reconstruction finished" Apr 24 19:06:41.866414 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866362 2583 reconciler.go:26] "Reconciler: start to sync state" Apr 24 19:06:41.866708 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.866546 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:41.866708 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866698 2583 factory.go:55] Registering systemd factory Apr 24 19:06:41.866790 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866715 2583 factory.go:223] Registration of the systemd container factory successfully Apr 24 19:06:41.866963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866945 2583 factory.go:153] Registering CRI-O factory Apr 24 19:06:41.867017 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.866966 2583 factory.go:223] Registration of the crio container factory successfully Apr 24 19:06:41.867070 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.867017 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 19:06:41.867070 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.867047 2583 factory.go:103] Registering Raw factory Apr 24 19:06:41.867070 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.867064 2583 manager.go:1196] Started watching for new ooms in manager Apr 24 19:06:41.867201 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.867137 2583 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 19:06:41.867461 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.867447 2583 manager.go:319] Starting recovery of all containers Apr 24 19:06:41.867884 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.867858 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:41.873045 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.873016 2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-23.ec2.internal\" not found" node="ip-10-0-137-23.ec2.internal" Apr 24 19:06:41.875897 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.875879 2583 manager.go:324] Recovery completed Apr 24 19:06:41.881419 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.881404 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:41.884416 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.884400 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:41.884500 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.884429 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:41.884500 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.884440 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:41.884970 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.884954 2583 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 19:06:41.884970 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.884968 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 19:06:41.885063 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:06:41.884993 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:41.887406 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.887393 2583 policy_none.go:49] "None policy: Start" Apr 24 19:06:41.887453 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.887409 2583 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 19:06:41.887453 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.887419 2583 state_mem.go:35] "Initializing new in-memory state store" Apr 24 19:06:41.925875 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.925851 2583 manager.go:341] "Starting Device Plugin manager" Apr 24 19:06:41.926721 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.925899 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 19:06:41.926721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.925913 2583 server.go:85] "Starting device plugin registration server" Apr 24 19:06:41.926721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.926226 2583 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 19:06:41.926721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.926242 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 19:06:41.926721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.926388 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 19:06:41.926721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.926477 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 19:06:41.926721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:41.926487 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 19:06:41.946644 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.926898 2583 eviction_manager.go:267] "eviction manager: failed to check if we 
have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 19:06:41.946644 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:41.926938 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.011874 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.011793 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 19:06:42.013228 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.013198 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 19:06:42.013228 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.013230 2583 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 19:06:42.013446 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.013270 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 19:06:42.013446 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.013278 2583 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 19:06:42.013446 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.013380 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 19:06:42.015376 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.015352 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:42.026562 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.026529 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:42.029761 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.029743 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:42.029850 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.029777 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:42.029850 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.029788 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:42.029850 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.029810 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.037963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.037947 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.038025 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.037974 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-23.ec2.internal\": node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.053319 
ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.053286 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.113834 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.113799 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal"] Apr 24 19:06:42.113937 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.113887 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:42.114907 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.114891 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:42.114982 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.114923 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:42.114982 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.114934 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:42.116351 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.116339 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:42.116511 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.116498 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.116547 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.116529 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:42.117108 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.117092 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:42.117108 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.117118 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:42.117235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.117134 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:42.117235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.117194 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:42.117235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.117217 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:42.117235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.117230 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:42.118528 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.118514 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.118581 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.118537 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:42.119308 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.119290 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:42.119404 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.119324 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:42.119404 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.119339 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:42.140034 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.140011 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-23.ec2.internal\" not found" node="ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.144544 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.144524 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-23.ec2.internal\" not found" node="ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.154133 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.154108 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.168547 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.168522 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2c6610a84971699450282b763f465fdb-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal\" (UID: \"2c6610a84971699450282b763f465fdb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.168654 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.168556 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c6610a84971699450282b763f465fdb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal\" (UID: \"2c6610a84971699450282b763f465fdb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.168654 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.168576 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7361f60877a2c10988a706360ce354df-config\") pod \"kube-apiserver-proxy-ip-10-0-137-23.ec2.internal\" (UID: \"7361f60877a2c10988a706360ce354df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.254828 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.254796 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.269225 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.269167 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2c6610a84971699450282b763f465fdb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal\" (UID: \"2c6610a84971699450282b763f465fdb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.269225 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.269181 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/2c6610a84971699450282b763f465fdb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal\" (UID: \"2c6610a84971699450282b763f465fdb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.269225 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.269214 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c6610a84971699450282b763f465fdb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal\" (UID: \"2c6610a84971699450282b763f465fdb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.269358 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.269233 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7361f60877a2c10988a706360ce354df-config\") pod \"kube-apiserver-proxy-ip-10-0-137-23.ec2.internal\" (UID: \"7361f60877a2c10988a706360ce354df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.269358 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.269288 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7361f60877a2c10988a706360ce354df-config\") pod \"kube-apiserver-proxy-ip-10-0-137-23.ec2.internal\" (UID: \"7361f60877a2c10988a706360ce354df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.269358 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.269313 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c6610a84971699450282b763f465fdb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal\" (UID: \"2c6610a84971699450282b763f465fdb\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.355663 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.355628 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.444176 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.444138 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.446884 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.446864 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" Apr 24 19:06:42.456653 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.456629 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.557285 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.557155 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.657754 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.657727 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.758315 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.758246 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found" Apr 24 19:06:42.766450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.766426 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 19:06:42.766593 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.766576 2583 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 19:06:42.766644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.766576 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 19:06:42.856543 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.856312 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 19:01:41 +0000 UTC" deadline="2027-10-13 16:25:35.874103794 +0000 UTC"
Apr 24 19:06:42.856543 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.856541 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12885h18m53.017568678s"
Apr 24 19:06:42.859378 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:42.859354 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-23.ec2.internal\" not found"
Apr 24 19:06:42.865185 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.865163 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 19:06:42.878441 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.878411 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:06:42.900817 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.900789 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rwtzc"
Apr 24 19:06:42.909564 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.909541 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rwtzc"
Apr 24 19:06:42.933685 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.933658 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:06:42.939817 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.939765 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:06:42.949643 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:42.949602 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c6610a84971699450282b763f465fdb.slice/crio-7b85d8b655283944c8ade46a2aa250fd5536283133d525c3d0af5e6dc13648c6 WatchSource:0}: Error finding container 7b85d8b655283944c8ade46a2aa250fd5536283133d525c3d0af5e6dc13648c6: Status 404 returned error can't find the container with id 7b85d8b655283944c8ade46a2aa250fd5536283133d525c3d0af5e6dc13648c6
Apr 24 19:06:42.950038 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:42.950012 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7361f60877a2c10988a706360ce354df.slice/crio-a8a0597a43d7476e11e8e51932c06c51490f192d6c4f6142ec7b6b87174f1662 WatchSource:0}: Error finding container a8a0597a43d7476e11e8e51932c06c51490f192d6c4f6142ec7b6b87174f1662: Status 404 returned error can't find the container with id a8a0597a43d7476e11e8e51932c06c51490f192d6c4f6142ec7b6b87174f1662
Apr 24 19:06:42.954123 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.954106 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:06:42.966894 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.966869 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal"
Apr 24 19:06:42.978267 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.978234 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 19:06:42.979283 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.979268 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal"
Apr 24 19:06:42.988945 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:42.988927 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 19:06:43.017053 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.017002 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" event={"ID":"2c6610a84971699450282b763f465fdb","Type":"ContainerStarted","Data":"7b85d8b655283944c8ade46a2aa250fd5536283133d525c3d0af5e6dc13648c6"}
Apr 24 19:06:43.017935 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.017912 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" event={"ID":"7361f60877a2c10988a706360ce354df","Type":"ContainerStarted","Data":"a8a0597a43d7476e11e8e51932c06c51490f192d6c4f6142ec7b6b87174f1662"}
Apr 24 19:06:43.798469 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.798430 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 19:06:43.843846 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.843816 2583 apiserver.go:52] "Watching apiserver"
Apr 24 19:06:43.854420 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.854393 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 19:06:43.855474 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.855438 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qdfd7","openshift-multus/network-metrics-daemon-f6x9g","kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm","openshift-cluster-node-tuning-operator/tuned-z56hw","openshift-multus/multus-lv66n","openshift-network-diagnostics/network-check-target-n6v84","openshift-network-operator/iptables-alerter-xzp6g","openshift-ovn-kubernetes/ovnkube-node-b2ftq","kube-system/konnectivity-agent-fpdqf","openshift-image-registry/node-ca-mm5fz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal"]
Apr 24 19:06:43.857822 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.857340 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.858763 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.858498 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:43.858763 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.858579 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:06:43.860781 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.860748 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm"
Apr 24 19:06:43.861505 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.861483 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 19:06:43.862091 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.861874 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kpm9h\""
Apr 24 19:06:43.862091 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.861903 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 19:06:43.862091 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.861903 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 19:06:43.862091 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.861954 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 19:06:43.862091 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.861906 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 19:06:43.863031 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.863011 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.864122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.864103 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.865636 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.865393 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:43.865636 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.865454 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:06:43.866706 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.866688 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xzp6g"
Apr 24 19:06:43.868152 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.868048 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.869458 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.869436 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mm5fz"
Apr 24 19:06:43.869560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.869501 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fpdqf"
Apr 24 19:06:43.877894 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.877870 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-slash\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878015 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.877905 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-var-lib-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878015 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.877933 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-ovn\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878015 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.877957 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5pz\" (UniqueName: \"kubernetes.io/projected/c4078663-d8da-42c2-8049-c863b1b49ea9-kube-api-access-fq5pz\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.878015 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.877983 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-k8s-cni-cncf-io\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878015 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878007 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-etc-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878030 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878052 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-run\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878074 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-systemd-units\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878107 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878136 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-cnibin\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878165 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878194 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-socket-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm"
Apr 24 19:06:43.878243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878222 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-etc-selinux\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878280 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysconfig\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878307 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-cni-bin\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878328 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-kubernetes\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878347 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-lib-modules\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878388 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-cni-bin\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878409 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-kubelet\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878426 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-kubelet\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878442 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-node-log\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878465 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-os-release\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878508 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx58x\" (UniqueName: \"kubernetes.io/projected/6dc56d95-2724-41c7-beac-3fc12b5e8960-kube-api-access-cx58x\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm"
Apr 24 19:06:43.878560 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878546 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-system-cni-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878569 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-socket-dir-parent\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878616 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-hostroot\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878640 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-conf-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878662 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/39a410d7-8c61-49c0-8950-af91f35238f3-multus-daemon-config\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878687 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878724 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-run-ovn-kubernetes\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878755 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-tuned\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878803 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-os-release\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878832 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39a410d7-8c61-49c0-8950-af91f35238f3-cni-binary-copy\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878847 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-etc-kubernetes\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878870 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a2993c8-f0a2-46ca-be46-e34e78416219-iptables-alerter-script\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878891 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2993c8-f0a2-46ca-be46-e34e78416219-host-slash\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878914 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvm4v\" (UniqueName: \"kubernetes.io/projected/3a2993c8-f0a2-46ca-be46-e34e78416219-kube-api-access-mvm4v\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878938 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-cni-netd\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.878963 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878960 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrn9g\" (UniqueName: \"kubernetes.io/projected/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-kube-api-access-xrn9g\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.878984 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-modprobe-d\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879005 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hznp\" (UniqueName: \"kubernetes.io/projected/39a410d7-8c61-49c0-8950-af91f35238f3-kube-api-access-9hznp\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879029 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-system-cni-dir\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879053 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-host\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879071 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-run-netns\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879086 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879104 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-systemd\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879125 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4078663-d8da-42c2-8049-c863b1b49ea9-tmp\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879150 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-cni-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879175 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-cnibin\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879193 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-multus-certs\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879215 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-log-socket\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879276 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovnkube-config\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879312 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8zlr\" (UniqueName: \"kubernetes.io/projected/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-kube-api-access-l8zlr\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879343 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.879635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879369 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879396 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6drm\" (UniqueName: \"kubernetes.io/projected/2a4ef86a-6412-43fa-ba15-979962cfdfad-kube-api-access-v6drm\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879418 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-device-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879465 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysctl-d\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879488 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-netns\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879512 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovnkube-script-lib\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879536 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879607 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-var-lib-kubelet\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879633 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-cni-multus\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879692 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-systemd\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879715 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\"
(UniqueName: \"kubernetes.io/secret/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovn-node-metrics-cert\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879738 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-sys-fs\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879785 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879816 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-sys\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879844 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-env-overrides\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 
19:06:43.879868 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-registration-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.880330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.879900 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysctl-conf\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.885858 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.885885 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.885858 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.886011 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.886122 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: 
I0424 19:06:43.886180 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.886976 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nzf4q\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.887151 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.887400 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:43.887573 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.887419 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 19:06:43.888109 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.887640 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 19:06:43.888109 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.887652 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 19:06:43.888109 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.887933 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 19:06:43.888109 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.887964 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 19:06:43.888109 ip-10-0-137-23 kubenswrapper[2583]: I0424 
19:06:43.887991 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 19:06:43.888364 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888190 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 19:06:43.888364 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888338 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-clc5q\"" Apr 24 19:06:43.888364 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888361 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 19:06:43.888507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888407 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 19:06:43.888507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888474 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 19:06:43.888654 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888582 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pvkkf\"" Apr 24 19:06:43.888704 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888660 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2kjfz\"" Apr 24 19:06:43.888704 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888674 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lf2vs\"" Apr 24 19:06:43.888704 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888682 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q9lm4\"" Apr 24 19:06:43.888840 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.888773 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 19:06:43.889471 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.889453 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 19:06:43.890366 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.890347 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-62ck5\"" Apr 24 19:06:43.910226 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.910188 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:42 +0000 UTC" deadline="2027-11-07 21:08:23.964192971 +0000 UTC" Apr 24 19:06:43.910226 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.910224 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13490h1m40.053972788s" Apr 24 19:06:43.967716 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.967690 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 19:06:43.980879 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.980842 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysctl-d\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.980879 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.980881 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-netns\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.981121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.980901 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovnkube-script-lib\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.981121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.980919 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.981121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.980998 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.981121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981003 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-netns\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.981121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981042 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1daf02a1-51a2-4eb1-a1b5-ed9667d63027-konnectivity-ca\") pod \"konnectivity-agent-fpdqf\" (UID: \"1daf02a1-51a2-4eb1-a1b5-ed9667d63027\") " pod="kube-system/konnectivity-agent-fpdqf" Apr 24 19:06:43.981121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981064 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysctl-d\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.981121 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981094 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-serviceca\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:43.981450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981130 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-var-lib-kubelet\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.981450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981168 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-var-lib-kubelet\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.981450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981208 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-cni-multus\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.981450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981280 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-systemd\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.981450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981307 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-cni-multus\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.981450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981357 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovn-node-metrics-cert\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.981450 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981384 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-sys-fs\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 
19:06:43.981473 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981520 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-sys\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981555 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-env-overrides\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981645 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-sys\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981646 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-registration-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.981754 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:06:43.981695 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovnkube-script-lib\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981705 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-registration-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981696 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1daf02a1-51a2-4eb1-a1b5-ed9667d63027-agent-certs\") pod \"konnectivity-agent-fpdqf\" (UID: \"1daf02a1-51a2-4eb1-a1b5-ed9667d63027\") " pod="kube-system/konnectivity-agent-fpdqf" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981752 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysctl-conf\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.981754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981757 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-sys-fs\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981785 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-slash\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981830 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-var-lib-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981866 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysctl-conf\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981892 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-ovn\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981896 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-systemd\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981918 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5pz\" (UniqueName: \"kubernetes.io/projected/c4078663-d8da-42c2-8049-c863b1b49ea9-kube-api-access-fq5pz\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981923 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-slash\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981951 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-ovn\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981967 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-var-lib-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.981997 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-k8s-cni-cncf-io\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") 
" pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982014 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-etc-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982026 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-env-overrides\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-etc-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982052 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982079 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-k8s-cni-cncf-io\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " 
pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982114 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-run\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.982173 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:43.982398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-systemd-units\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982265 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-run\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.982292 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs podName:c0ea34e5-a89a-4142-83d4-e94ef986bfa4 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:44.482224049 +0000 UTC m=+3.087515448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs") pod "network-metrics-daemon-f6x9g" (UID: "c0ea34e5-a89a-4142-83d4-e94ef986bfa4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982318 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-systemd-units\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982347 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982378 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982410 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-cnibin\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982435 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982410 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982456 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-socket-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982495 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-etc-selinux\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982524 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl9vp\" (UniqueName: \"kubernetes.io/projected/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-kube-api-access-gl9vp\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" 
Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982543 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-socket-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982551 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysconfig\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982580 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-cni-bin\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982596 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-sysconfig\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.983204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-kubernetes\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982631 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-lib-modules\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982636 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-cnibin\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982654 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-cni-bin\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982677 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-kubelet\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982681 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-etc-selinux\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982703 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-host\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982723 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-cni-bin\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982727 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-kubernetes\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982734 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-kubelet\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982778 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-kubelet\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982788 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-var-lib-kubelet\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982791 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-cni-bin\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982794 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-lib-modules\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982806 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-node-log\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982826 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-os-release\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " 
pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982862 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx58x\" (UniqueName: \"kubernetes.io/projected/6dc56d95-2724-41c7-beac-3fc12b5e8960-kube-api-access-cx58x\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982888 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-system-cni-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.986318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982894 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-node-log\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982922 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-os-release\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982936 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-system-cni-dir\") pod \"multus-lv66n\" (UID: 
\"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982956 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-socket-dir-parent\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982977 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-hostroot\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.982998 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983002 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-conf-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983013 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-socket-dir-parent\") pod \"multus-lv66n\" (UID: 
\"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983030 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-conf-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-hostroot\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983040 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/39a410d7-8c61-49c0-8950-af91f35238f3-multus-daemon-config\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983062 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983082 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-run-ovn-kubernetes\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983120 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-tuned\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983124 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-run-openvswitch\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983141 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-os-release\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983161 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39a410d7-8c61-49c0-8950-af91f35238f3-cni-binary-copy\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983165 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-run-ovn-kubernetes\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.987122 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983217 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-os-release\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983274 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-etc-kubernetes\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983317 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-etc-kubernetes\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983331 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a2993c8-f0a2-46ca-be46-e34e78416219-iptables-alerter-script\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983366 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2993c8-f0a2-46ca-be46-e34e78416219-host-slash\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " 
pod="openshift-network-operator/iptables-alerter-xzp6g" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983387 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvm4v\" (UniqueName: \"kubernetes.io/projected/3a2993c8-f0a2-46ca-be46-e34e78416219-kube-api-access-mvm4v\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983408 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-cni-netd\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983428 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrn9g\" (UniqueName: \"kubernetes.io/projected/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-kube-api-access-xrn9g\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983465 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-modprobe-d\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983484 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hznp\" (UniqueName: 
\"kubernetes.io/projected/39a410d7-8c61-49c0-8950-af91f35238f3-kube-api-access-9hznp\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983505 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-system-cni-dir\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983506 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a2993c8-f0a2-46ca-be46-e34e78416219-host-slash\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983525 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-host\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983597 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/39a410d7-8c61-49c0-8950-af91f35238f3-multus-daemon-config\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983690 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-run-netns\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983754 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/39a410d7-8c61-49c0-8950-af91f35238f3-cni-binary-copy\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983755 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.987809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983803 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-host\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983836 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-system-cni-dir\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983843 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-modprobe-d\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983858 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-cni-netd\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983868 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-host-run-netns\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983948 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-systemd\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983973 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4078663-d8da-42c2-8049-c863b1b49ea9-tmp\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983990 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-systemd\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.983996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-cni-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984033 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-cnibin\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984052 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-multus-cni-dir\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984055 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-multus-certs\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984092 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-log-socket\") pod 
\"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984104 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-cnibin\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984116 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovnkube-config\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984126 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a4ef86a-6412-43fa-ba15-979962cfdfad-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984148 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/39a410d7-8c61-49c0-8950-af91f35238f3-host-run-multus-certs\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984173 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-log-socket\") pod \"ovnkube-node-b2ftq\" (UID: 
\"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.988281 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984172 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8zlr\" (UniqueName: \"kubernetes.io/projected/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-kube-api-access-l8zlr\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984204 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984223 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984266 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6drm\" (UniqueName: \"kubernetes.io/projected/2a4ef86a-6412-43fa-ba15-979962cfdfad-kube-api-access-v6drm\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984302 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-device-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984372 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6dc56d95-2724-41c7-beac-3fc12b5e8960-device-dir\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984712 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-cni-binary-copy\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984879 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovnkube-config\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.984900 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a4ef86a-6412-43fa-ba15-979962cfdfad-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.989024 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:06:43.984917 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a2993c8-f0a2-46ca-be46-e34e78416219-iptables-alerter-script\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.986979 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c4078663-d8da-42c2-8049-c863b1b49ea9-etc-tuned\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.987007 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4078663-d8da-42c2-8049-c863b1b49ea9-tmp\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.987067 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-ovn-node-metrics-cert\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.987555 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.987573 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.987586 2583 projected.go:194] Error preparing data for projected volume kube-api-access-c45x4 for pod openshift-network-diagnostics/network-check-target-n6v84: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:43.989024 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:43.987650 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4 podName:12b20576-da14-4ba1-926b-fed787f86bfb nodeName:}" failed. No retries permitted until 2026-04-24 19:06:44.487632387 +0000 UTC m=+3.092923798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c45x4" (UniqueName: "kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4") pod "network-check-target-n6v84" (UID: "12b20576-da14-4ba1-926b-fed787f86bfb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:43.989588 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.989570 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5pz\" (UniqueName: \"kubernetes.io/projected/c4078663-d8da-42c2-8049-c863b1b49ea9-kube-api-access-fq5pz\") pod \"tuned-z56hw\" (UID: \"c4078663-d8da-42c2-8049-c863b1b49ea9\") " pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:43.993967 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.993930 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6drm\" (UniqueName: \"kubernetes.io/projected/2a4ef86a-6412-43fa-ba15-979962cfdfad-kube-api-access-v6drm\") pod 
\"multus-additional-cni-plugins-qdfd7\" (UID: \"2a4ef86a-6412-43fa-ba15-979962cfdfad\") " pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:43.994474 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.994446 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8zlr\" (UniqueName: \"kubernetes.io/projected/c3b9dbc7-3f43-4d25-9375-c9c4859dd641-kube-api-access-l8zlr\") pod \"ovnkube-node-b2ftq\" (UID: \"c3b9dbc7-3f43-4d25-9375-c9c4859dd641\") " pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:43.994868 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.994839 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrn9g\" (UniqueName: \"kubernetes.io/projected/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-kube-api-access-xrn9g\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:06:43.995305 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.995282 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx58x\" (UniqueName: \"kubernetes.io/projected/6dc56d95-2724-41c7-beac-3fc12b5e8960-kube-api-access-cx58x\") pod \"aws-ebs-csi-driver-node-2s7zm\" (UID: \"6dc56d95-2724-41c7-beac-3fc12b5e8960\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:43.995391 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.995315 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvm4v\" (UniqueName: \"kubernetes.io/projected/3a2993c8-f0a2-46ca-be46-e34e78416219-kube-api-access-mvm4v\") pod \"iptables-alerter-xzp6g\" (UID: \"3a2993c8-f0a2-46ca-be46-e34e78416219\") " pod="openshift-network-operator/iptables-alerter-xzp6g" Apr 24 19:06:43.996300 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:43.996273 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9hznp\" (UniqueName: \"kubernetes.io/projected/39a410d7-8c61-49c0-8950-af91f35238f3-kube-api-access-9hznp\") pod \"multus-lv66n\" (UID: \"39a410d7-8c61-49c0-8950-af91f35238f3\") " pod="openshift-multus/multus-lv66n" Apr 24 19:06:44.084827 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.084738 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1daf02a1-51a2-4eb1-a1b5-ed9667d63027-agent-certs\") pod \"konnectivity-agent-fpdqf\" (UID: \"1daf02a1-51a2-4eb1-a1b5-ed9667d63027\") " pod="kube-system/konnectivity-agent-fpdqf" Apr 24 19:06:44.084827 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.084804 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl9vp\" (UniqueName: \"kubernetes.io/projected/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-kube-api-access-gl9vp\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:44.085043 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.084834 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-host\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:44.085043 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.084903 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1daf02a1-51a2-4eb1-a1b5-ed9667d63027-konnectivity-ca\") pod \"konnectivity-agent-fpdqf\" (UID: \"1daf02a1-51a2-4eb1-a1b5-ed9667d63027\") " pod="kube-system/konnectivity-agent-fpdqf" Apr 24 19:06:44.085043 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.084927 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-serviceca\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:44.085043 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.084955 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-host\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:44.085475 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.085448 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-serviceca\") pod \"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:44.085607 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.085533 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1daf02a1-51a2-4eb1-a1b5-ed9667d63027-konnectivity-ca\") pod \"konnectivity-agent-fpdqf\" (UID: \"1daf02a1-51a2-4eb1-a1b5-ed9667d63027\") " pod="kube-system/konnectivity-agent-fpdqf" Apr 24 19:06:44.087634 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.087612 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1daf02a1-51a2-4eb1-a1b5-ed9667d63027-agent-certs\") pod \"konnectivity-agent-fpdqf\" (UID: \"1daf02a1-51a2-4eb1-a1b5-ed9667d63027\") " pod="kube-system/konnectivity-agent-fpdqf" Apr 24 19:06:44.092898 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.092863 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl9vp\" (UniqueName: \"kubernetes.io/projected/c359622a-4d36-4dcb-b06d-e8b0a4c453ad-kube-api-access-gl9vp\") pod 
\"node-ca-mm5fz\" (UID: \"c359622a-4d36-4dcb-b06d-e8b0a4c453ad\") " pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:44.168913 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.168856 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" Apr 24 19:06:44.176737 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.176710 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" Apr 24 19:06:44.185304 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.185278 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z56hw" Apr 24 19:06:44.190986 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.190957 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lv66n" Apr 24 19:06:44.198686 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.198648 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xzp6g" Apr 24 19:06:44.210435 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.210408 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:06:44.218062 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.218032 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mm5fz" Apr 24 19:06:44.222701 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.222681 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-fpdqf" Apr 24 19:06:44.334758 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.334721 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:44.487160 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.487069 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:06:44.487331 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:44.487218 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:44.487331 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:44.487314 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs podName:c0ea34e5-a89a-4142-83d4-e94ef986bfa4 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:45.487292263 +0000 UTC m=+4.092583678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs") pod "network-metrics-daemon-f6x9g" (UID: "c0ea34e5-a89a-4142-83d4-e94ef986bfa4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:44.555711 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.555647 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4078663_d8da_42c2_8049_c863b1b49ea9.slice/crio-a43fb01a5026fa6cd9933e83f02694eb9aea801f87b756f5cdf8e2ef402c8e5d WatchSource:0}: Error finding container a43fb01a5026fa6cd9933e83f02694eb9aea801f87b756f5cdf8e2ef402c8e5d: Status 404 returned error can't find the container with id a43fb01a5026fa6cd9933e83f02694eb9aea801f87b756f5cdf8e2ef402c8e5d Apr 24 19:06:44.558968 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.558926 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dc56d95_2724_41c7_beac_3fc12b5e8960.slice/crio-e23b5e6d63b85f4356bc367a3ca929f4939852a999e7288471bd26be5c88108d WatchSource:0}: Error finding container e23b5e6d63b85f4356bc367a3ca929f4939852a999e7288471bd26be5c88108d: Status 404 returned error can't find the container with id e23b5e6d63b85f4356bc367a3ca929f4939852a999e7288471bd26be5c88108d Apr 24 19:06:44.561162 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.561132 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3b9dbc7_3f43_4d25_9375_c9c4859dd641.slice/crio-67ca473b46f56aa02635a15e41ce33e28f346c199f902c1a0370c7eabe509309 WatchSource:0}: Error finding container 67ca473b46f56aa02635a15e41ce33e28f346c199f902c1a0370c7eabe509309: Status 404 returned error can't find the container with id 67ca473b46f56aa02635a15e41ce33e28f346c199f902c1a0370c7eabe509309 Apr 24 19:06:44.562605 
ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.562580 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39a410d7_8c61_49c0_8950_af91f35238f3.slice/crio-d2ff69e2a02e89d16ae4c00c234c3dcdc9613d302e225d54bf3f848aecba3f59 WatchSource:0}: Error finding container d2ff69e2a02e89d16ae4c00c234c3dcdc9613d302e225d54bf3f848aecba3f59: Status 404 returned error can't find the container with id d2ff69e2a02e89d16ae4c00c234c3dcdc9613d302e225d54bf3f848aecba3f59 Apr 24 19:06:44.563797 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.563622 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1daf02a1_51a2_4eb1_a1b5_ed9667d63027.slice/crio-2ff2983eedcbe9a21a16499b99d97a7726d74897f09b14362c51aa6d9d8f7076 WatchSource:0}: Error finding container 2ff2983eedcbe9a21a16499b99d97a7726d74897f09b14362c51aa6d9d8f7076: Status 404 returned error can't find the container with id 2ff2983eedcbe9a21a16499b99d97a7726d74897f09b14362c51aa6d9d8f7076 Apr 24 19:06:44.564923 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.564714 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a2993c8_f0a2_46ca_be46_e34e78416219.slice/crio-b276a255dee43c984e05fd3e6f37f32fc1a2e032e63cafe7d8b4bf294309bb76 WatchSource:0}: Error finding container b276a255dee43c984e05fd3e6f37f32fc1a2e032e63cafe7d8b4bf294309bb76: Status 404 returned error can't find the container with id b276a255dee43c984e05fd3e6f37f32fc1a2e032e63cafe7d8b4bf294309bb76 Apr 24 19:06:44.566037 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.565979 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc359622a_4d36_4dcb_b06d_e8b0a4c453ad.slice/crio-2578d390f3d6e911a0881367af4cfd2d22013f59e2bdd154c087149dc34c4107 WatchSource:0}: Error 
finding container 2578d390f3d6e911a0881367af4cfd2d22013f59e2bdd154c087149dc34c4107: Status 404 returned error can't find the container with id 2578d390f3d6e911a0881367af4cfd2d22013f59e2bdd154c087149dc34c4107 Apr 24 19:06:44.566358 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:06:44.566317 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a4ef86a_6412_43fa_ba15_979962cfdfad.slice/crio-d0abf1b2aca43ad7ce0d61b1cd7d721253dded63ed2fc55ff3c81e9bc8f5717b WatchSource:0}: Error finding container d0abf1b2aca43ad7ce0d61b1cd7d721253dded63ed2fc55ff3c81e9bc8f5717b: Status 404 returned error can't find the container with id d0abf1b2aca43ad7ce0d61b1cd7d721253dded63ed2fc55ff3c81e9bc8f5717b Apr 24 19:06:44.587377 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.587352 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:06:44.587533 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:44.587515 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:44.587593 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:44.587541 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:44.587593 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:44.587553 2583 projected.go:194] Error preparing data for projected volume kube-api-access-c45x4 for pod openshift-network-diagnostics/network-check-target-n6v84: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:44.587684 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:44.587615 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4 podName:12b20576-da14-4ba1-926b-fed787f86bfb nodeName:}" failed. No retries permitted until 2026-04-24 19:06:45.587594716 +0000 UTC m=+4.192886125 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c45x4" (UniqueName: "kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4") pod "network-check-target-n6v84" (UID: "12b20576-da14-4ba1-926b-fed787f86bfb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:44.911085 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.910993 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:42 +0000 UTC" deadline="2027-12-26 05:12:27.61175789 +0000 UTC" Apr 24 19:06:44.911085 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:44.911039 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14650h5m42.700725956s" Apr 24 19:06:45.015042 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.014441 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:06:45.015042 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:45.014650 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb" Apr 24 19:06:45.036544 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.036506 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" event={"ID":"6dc56d95-2724-41c7-beac-3fc12b5e8960","Type":"ContainerStarted","Data":"e23b5e6d63b85f4356bc367a3ca929f4939852a999e7288471bd26be5c88108d"} Apr 24 19:06:45.042393 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.041599 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" event={"ID":"7361f60877a2c10988a706360ce354df","Type":"ContainerStarted","Data":"c690c15b6d2d06d27d2afc77050ffd6c6f64c4c93c8dc78c657a750410ce2f1d"} Apr 24 19:06:45.045683 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.045649 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xzp6g" event={"ID":"3a2993c8-f0a2-46ca-be46-e34e78416219","Type":"ContainerStarted","Data":"b276a255dee43c984e05fd3e6f37f32fc1a2e032e63cafe7d8b4bf294309bb76"} Apr 24 19:06:45.049318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.049283 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fpdqf" event={"ID":"1daf02a1-51a2-4eb1-a1b5-ed9667d63027","Type":"ContainerStarted","Data":"2ff2983eedcbe9a21a16499b99d97a7726d74897f09b14362c51aa6d9d8f7076"} Apr 24 19:06:45.054355 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:06:45.054298 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-23.ec2.internal" podStartSLOduration=3.054279108 podStartE2EDuration="3.054279108s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:45.053406457 +0000 UTC m=+3.658697876" watchObservedRunningTime="2026-04-24 19:06:45.054279108 +0000 UTC m=+3.659570526" Apr 24 19:06:45.060170 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.060137 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"67ca473b46f56aa02635a15e41ce33e28f346c199f902c1a0370c7eabe509309"} Apr 24 19:06:45.068689 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.068634 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z56hw" event={"ID":"c4078663-d8da-42c2-8049-c863b1b49ea9","Type":"ContainerStarted","Data":"a43fb01a5026fa6cd9933e83f02694eb9aea801f87b756f5cdf8e2ef402c8e5d"} Apr 24 19:06:45.071087 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.071047 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerStarted","Data":"d0abf1b2aca43ad7ce0d61b1cd7d721253dded63ed2fc55ff3c81e9bc8f5717b"} Apr 24 19:06:45.074443 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.074413 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mm5fz" event={"ID":"c359622a-4d36-4dcb-b06d-e8b0a4c453ad","Type":"ContainerStarted","Data":"2578d390f3d6e911a0881367af4cfd2d22013f59e2bdd154c087149dc34c4107"} Apr 24 19:06:45.076539 ip-10-0-137-23 kubenswrapper[2583]: I0424 
19:06:45.076514 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lv66n" event={"ID":"39a410d7-8c61-49c0-8950-af91f35238f3","Type":"ContainerStarted","Data":"d2ff69e2a02e89d16ae4c00c234c3dcdc9613d302e225d54bf3f848aecba3f59"} Apr 24 19:06:45.495755 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.495715 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:06:45.495941 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:45.495885 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:45.496000 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:45.495947 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs podName:c0ea34e5-a89a-4142-83d4-e94ef986bfa4 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:47.49592881 +0000 UTC m=+6.101220209 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs") pod "network-metrics-daemon-f6x9g" (UID: "c0ea34e5-a89a-4142-83d4-e94ef986bfa4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:45.596526 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:45.596429 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:06:45.596687 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:45.596601 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:45.596687 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:45.596621 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:45.596687 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:45.596635 2583 projected.go:194] Error preparing data for projected volume kube-api-access-c45x4 for pod openshift-network-diagnostics/network-check-target-n6v84: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:45.596882 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:45.596700 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4 podName:12b20576-da14-4ba1-926b-fed787f86bfb nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:47.596679812 +0000 UTC m=+6.201971220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c45x4" (UniqueName: "kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4") pod "network-check-target-n6v84" (UID: "12b20576-da14-4ba1-926b-fed787f86bfb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:46.014631 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:46.014595 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:06:46.015056 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:46.014746 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4" Apr 24 19:06:46.087279 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:46.087174 2583 generic.go:358] "Generic (PLEG): container finished" podID="2c6610a84971699450282b763f465fdb" containerID="a381ed5c887cca71a1c3bcc08fe5860a71a3815ee8f4162d3306a78e0b55b9dc" exitCode=0 Apr 24 19:06:46.088115 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:46.087796 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" event={"ID":"2c6610a84971699450282b763f465fdb","Type":"ContainerDied","Data":"a381ed5c887cca71a1c3bcc08fe5860a71a3815ee8f4162d3306a78e0b55b9dc"} Apr 24 19:06:47.014495 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:47.014211 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:06:47.014698 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:47.014601 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb" Apr 24 19:06:47.095754 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:47.095716 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" event={"ID":"2c6610a84971699450282b763f465fdb","Type":"ContainerStarted","Data":"95d0601c1ea12700afb5cb5fb9ecfc15a902d31708f88a0cb8f126c1f5bfd6a6"} Apr 24 19:06:47.113056 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:47.112615 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-23.ec2.internal" podStartSLOduration=5.112595871 podStartE2EDuration="5.112595871s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:47.112444265 +0000 UTC m=+5.717735685" watchObservedRunningTime="2026-04-24 19:06:47.112595871 +0000 UTC m=+5.717887288" Apr 24 19:06:47.513306 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:47.512684 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 
19:06:47.513306 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:47.512822 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:47.513306 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:47.512885 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs podName:c0ea34e5-a89a-4142-83d4-e94ef986bfa4 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:51.512866823 +0000 UTC m=+10.118158221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs") pod "network-metrics-daemon-f6x9g" (UID: "c0ea34e5-a89a-4142-83d4-e94ef986bfa4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:47.613743 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:47.613124 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:47.613743 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:47.613343 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:47.613743 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:47.613367 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:47.613743 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:47.613380 2583 projected.go:194] Error preparing data for projected volume kube-api-access-c45x4 for pod openshift-network-diagnostics/network-check-target-n6v84: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:47.613743 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:47.613442 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4 podName:12b20576-da14-4ba1-926b-fed787f86bfb nodeName:}" failed. No retries permitted until 2026-04-24 19:06:51.613421645 +0000 UTC m=+10.218713054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c45x4" (UniqueName: "kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4") pod "network-check-target-n6v84" (UID: "12b20576-da14-4ba1-926b-fed787f86bfb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:48.014154 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:48.013676 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:48.014154 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:48.013812 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:06:49.014182 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:49.014141 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:49.014602 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:49.014443 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:06:50.014171 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:50.013727 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:50.014171 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:50.013865 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:06:51.013714 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:51.013675 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:51.013885 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:51.013834 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:06:51.546193 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:51.546117 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:51.546677 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:51.546275 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:51.546677 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:51.546339 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs podName:c0ea34e5-a89a-4142-83d4-e94ef986bfa4 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:59.546320113 +0000 UTC m=+18.151611523 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs") pod "network-metrics-daemon-f6x9g" (UID: "c0ea34e5-a89a-4142-83d4-e94ef986bfa4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:51.647321 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:51.646630 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:51.647321 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:51.646854 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:51.647321 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:51.646876 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:51.647321 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:51.646889 2583 projected.go:194] Error preparing data for projected volume kube-api-access-c45x4 for pod openshift-network-diagnostics/network-check-target-n6v84: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:51.647321 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:51.646950 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4 podName:12b20576-da14-4ba1-926b-fed787f86bfb nodeName:}" failed. No retries permitted until 2026-04-24 19:06:59.646931328 +0000 UTC m=+18.252222747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c45x4" (UniqueName: "kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4") pod "network-check-target-n6v84" (UID: "12b20576-da14-4ba1-926b-fed787f86bfb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:52.014505 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:52.014473 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:52.014664 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:52.014578 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:06:53.013708 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:53.013672 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:53.014168 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:53.013796 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:06:54.014572 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:54.014543 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:54.014982 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:54.014660 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:06:55.014469 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:55.014431 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:55.014629 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:55.014536 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:06:56.014432 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:56.014393 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:56.014637 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:56.014527 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:06:57.013944 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:57.013907 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:57.014127 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:57.014029 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:06:58.014268 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:58.014220 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:58.014692 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:58.014363 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:06:59.014079 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:59.014040 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:59.014238 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:59.014162 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:06:59.601484 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:59.601450 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:06:59.602033 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:59.601620 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:59.602033 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:59.601703 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs podName:c0ea34e5-a89a-4142-83d4-e94ef986bfa4 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:15.601681656 +0000 UTC m=+34.206973051 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs") pod "network-metrics-daemon-f6x9g" (UID: "c0ea34e5-a89a-4142-83d4-e94ef986bfa4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:59.702145 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:06:59.702108 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:06:59.702335 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:59.702235 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:59.702335 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:59.702272 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:59.702335 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:59.702289 2583 projected.go:194] Error preparing data for projected volume kube-api-access-c45x4 for pod openshift-network-diagnostics/network-check-target-n6v84: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:59.702471 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:06:59.702347 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4 podName:12b20576-da14-4ba1-926b-fed787f86bfb nodeName:}" failed. No retries permitted until 2026-04-24 19:07:15.702329332 +0000 UTC m=+34.307620727 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c45x4" (UniqueName: "kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4") pod "network-check-target-n6v84" (UID: "12b20576-da14-4ba1-926b-fed787f86bfb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:00.013937 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:00.013903 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:07:00.014102 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:00.014046 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:07:01.014214 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:01.014175 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:01.014695 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:01.014327 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:07:02.014380 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.014347 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:07:02.015111 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:02.014477 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:07:02.123771 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.123480 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fpdqf" event={"ID":"1daf02a1-51a2-4eb1-a1b5-ed9667d63027","Type":"ContainerStarted","Data":"67600433b44bdfb58625818de2be3bc49cd656dfdce90c4b3ac4e050c9babf9d"}
Apr 24 19:07:02.126488 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.126445 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"35e485d5ad8cc09116af741ce83944b874fe4efcf389edcc894b78391013fb10"}
Apr 24 19:07:02.128824 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.128705 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z56hw" event={"ID":"c4078663-d8da-42c2-8049-c863b1b49ea9","Type":"ContainerStarted","Data":"8b9d19463cc7b758d4392ecb9df3740bfe6161d7c4e439d7b37128e45f015a8e"}
Apr 24 19:07:02.133523 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.133495 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lv66n" event={"ID":"39a410d7-8c61-49c0-8950-af91f35238f3","Type":"ContainerStarted","Data":"85685b150825dadb33f14f33e29450a55eeb562cfbe39a0d32ad12e156ab1b5f"}
Apr 24 19:07:02.156895 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.156748 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fpdqf" podStartSLOduration=3.204027519 podStartE2EDuration="20.156707777s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.565606766 +0000 UTC m=+3.170898175" lastFinishedPulling="2026-04-24 19:07:01.518287037 +0000 UTC m=+20.123578433" observedRunningTime="2026-04-24 19:07:02.156560843 +0000 UTC m=+20.761852259" watchObservedRunningTime="2026-04-24 19:07:02.156707777 +0000 UTC m=+20.761999173"
Apr 24 19:07:02.220415 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.220369 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z56hw" podStartSLOduration=2.910694867 podStartE2EDuration="20.220353038s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.557346807 +0000 UTC m=+3.162638202" lastFinishedPulling="2026-04-24 19:07:01.867004968 +0000 UTC m=+20.472296373" observedRunningTime="2026-04-24 19:07:02.182912567 +0000 UTC m=+20.788203983" watchObservedRunningTime="2026-04-24 19:07:02.220353038 +0000 UTC m=+20.825644453"
Apr 24 19:07:02.220533 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:02.220464 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lv66n" podStartSLOduration=2.877873695 podStartE2EDuration="20.220460214s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.564524764 +0000 UTC m=+3.169816169" lastFinishedPulling="2026-04-24 19:07:01.907111279 +0000 UTC m=+20.512402688" observedRunningTime="2026-04-24 19:07:02.220208075 +0000 UTC m=+20.825499493" watchObservedRunningTime="2026-04-24 19:07:02.220460214 +0000 UTC m=+20.825751674"
Apr 24 19:07:03.014325 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.014114 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:03.014483 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:03.014414 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:07:03.138842 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.138809 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log"
Apr 24 19:07:03.139102 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.139080 2583 generic.go:358] "Generic (PLEG): container finished" podID="c3b9dbc7-3f43-4d25-9375-c9c4859dd641" containerID="4d7daa61e5778564c5a0744011239e4fc48914fb239ae1711afca624bb570cee" exitCode=1
Apr 24 19:07:03.139183 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.139154 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"98d12ac19ee804c38d542bcbefdd1cc40c1a8f261c4b67c2fcdbddfd46be1aaf"}
Apr 24 19:07:03.139235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.139181 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"b2d4d306010cd376c02137c11d70f3f1c75d1803d978b57d44bdcc7e98452c63"}
Apr 24 19:07:03.139235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.139196 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"c2786804a2edda05b74f21bc9fba7656a16e06847dfd477c86782e085b1715eb"}
Apr 24 19:07:03.139235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.139208 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"43ace5494ce6eb6dce2dbcdd8d5074d71ab39dfcb626c9488595a493981fa086"}
Apr 24 19:07:03.139235 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.139221 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerDied","Data":"4d7daa61e5778564c5a0744011239e4fc48914fb239ae1711afca624bb570cee"}
Apr 24 19:07:03.140522 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.140496 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a4ef86a-6412-43fa-ba15-979962cfdfad" containerID="f52686737171c5c45a2eecddd76cdf7c48a96314b84056ed7f61a84b9c9bf662" exitCode=0
Apr 24 19:07:03.140623 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.140577 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerDied","Data":"f52686737171c5c45a2eecddd76cdf7c48a96314b84056ed7f61a84b9c9bf662"}
Apr 24 19:07:03.141739 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.141708 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mm5fz" event={"ID":"c359622a-4d36-4dcb-b06d-e8b0a4c453ad","Type":"ContainerStarted","Data":"39e3cebafef8bdbdbe09d32447d806acccb7720b6042fbdf616603422ce7248a"}
Apr 24 19:07:03.142920 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.142898 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" event={"ID":"6dc56d95-2724-41c7-beac-3fc12b5e8960","Type":"ContainerStarted","Data":"5e8ca0be02a6b1b048d565c01966914764767c3ff3d8f3876fca47887ff6fa46"}
Apr 24 19:07:03.179966 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.179920 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mm5fz" podStartSLOduration=3.88043458 podStartE2EDuration="21.179906258s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.567866881 +0000 UTC m=+3.173158274" lastFinishedPulling="2026-04-24 19:07:01.867338545 +0000 UTC m=+20.472629952" observedRunningTime="2026-04-24 19:07:03.179187145 +0000 UTC m=+21.784478560" watchObservedRunningTime="2026-04-24 19:07:03.179906258 +0000 UTC m=+21.785197674"
Apr 24 19:07:03.201845 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.201815 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 19:07:03.548021 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.547954 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fpdqf"
Apr 24 19:07:03.936765 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.936622 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T19:07:03.201835376Z","UUID":"f6f95d20-80f7-42bd-a9c9-d454c4c6d54b","Handler":null,"Name":"","Endpoint":""}
Apr 24 19:07:03.939488 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.939460 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 19:07:03.939651 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:03.939500 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 19:07:04.013927 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:04.013894 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:07:04.014101 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:04.014044 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:07:04.147480 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:04.147443 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" event={"ID":"6dc56d95-2724-41c7-beac-3fc12b5e8960","Type":"ContainerStarted","Data":"b7c0544abe4f48dbb96322a20f8fbe7f5163af2e9d599a10b0d50e90ef90658f"}
Apr 24 19:07:04.148726 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:04.148696 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xzp6g" event={"ID":"3a2993c8-f0a2-46ca-be46-e34e78416219","Type":"ContainerStarted","Data":"9c6c73e07e0f535f149912299b7d1ee7232f41f7a2f351ca6d672850d877aa5a"}
Apr 24 19:07:05.014331 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.014299 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:05.014546 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:05.014423 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:07:05.155475 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.155425 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" event={"ID":"6dc56d95-2724-41c7-beac-3fc12b5e8960","Type":"ContainerStarted","Data":"a453a506cbd860e1159f415a1c588e99eb4b055d3782824bde8b986854503137"}
Apr 24 19:07:05.159665 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.159639 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log"
Apr 24 19:07:05.160104 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.160070 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"db26705994c9bb1574ccb3b3e4a4274a7b20de9fccf8f80d3db1465891bca6e9"}
Apr 24 19:07:05.173891 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.173836 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2s7zm" podStartSLOduration=3.526647966 podStartE2EDuration="23.173819899s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.563231453 +0000 UTC m=+3.168522852" lastFinishedPulling="2026-04-24 19:07:04.210403373 +0000 UTC m=+22.815694785" observedRunningTime="2026-04-24 19:07:05.173173374 +0000 UTC m=+23.778464791" watchObservedRunningTime="2026-04-24 19:07:05.173819899 +0000 UTC m=+23.779111319"
Apr 24 19:07:05.174271 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.174225 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xzp6g" podStartSLOduration=5.874297845 podStartE2EDuration="23.174214755s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.567078956 +0000 UTC m=+3.172370362" lastFinishedPulling="2026-04-24 19:07:01.866995865 +0000 UTC m=+20.472287272" observedRunningTime="2026-04-24 19:07:04.166100456 +0000 UTC m=+22.771391872" watchObservedRunningTime="2026-04-24 19:07:05.174214755 +0000 UTC m=+23.779506171"
Apr 24 19:07:05.510554 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.510478 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fpdqf"
Apr 24 19:07:05.511193 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:05.511178 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fpdqf"
Apr 24 19:07:06.014431 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:06.014399 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:07:06.014605 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:06.014535 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:07:06.163136 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:06.163112 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fpdqf"
Apr 24 19:07:07.013914 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:07.013875 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:07.014081 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:07.013990 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb"
Apr 24 19:07:08.014552 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.014510 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:07:08.015024 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:08.014628 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4"
Apr 24 19:07:08.168888 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.168857 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log"
Apr 24 19:07:08.169220 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.169195 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"7509c7a2bdeb669e6dd0a00466864394e207511e7ffaf0a10e46d4be519bcdb9"}
Apr 24 19:07:08.169531 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.169510 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:07:08.169601 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.169543 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq"
Apr 24 19:07:08.169739 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.169709 2583 scope.go:117] "RemoveContainer" containerID="4d7daa61e5778564c5a0744011239e4fc48914fb239ae1711afca624bb570cee"
Apr 24 19:07:08.171222 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.171202 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a4ef86a-6412-43fa-ba15-979962cfdfad" containerID="1acf19f689c951057410b9da7cc7d2a65207c72f0adb16824ec5e529d768f5d5" exitCode=0
Apr 24 19:07:08.171311 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.171245 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerDied","Data":"1acf19f689c951057410b9da7cc7d2a65207c72f0adb16824ec5e529d768f5d5"}
Apr 24 19:07:08.186542 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:08.186516
2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:07:09.014578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.014344 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:07:09.015042 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:09.014679 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb" Apr 24 19:07:09.116111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.116030 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6x9g"] Apr 24 19:07:09.116237 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.116162 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:07:09.116289 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:09.116275 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4" Apr 24 19:07:09.119333 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.119297 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n6v84"] Apr 24 19:07:09.175914 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.175885 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:07:09.176243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.176220 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" event={"ID":"c3b9dbc7-3f43-4d25-9375-c9c4859dd641","Type":"ContainerStarted","Data":"c25438f33fa3bf4ad39f1e88e3eff45a9a3b0aacabd89989bd43d86dfaf849c9"} Apr 24 19:07:09.176476 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.176464 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:07:09.178203 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.178184 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a4ef86a-6412-43fa-ba15-979962cfdfad" containerID="1187bb160bb5095acadb4ed318b929a470709214cddd74d3111c011381d93e09" exitCode=0 Apr 24 19:07:09.178304 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.178238 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:07:09.178304 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.178282 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerDied","Data":"1187bb160bb5095acadb4ed318b929a470709214cddd74d3111c011381d93e09"} Apr 24 19:07:09.178366 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:09.178350 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb" Apr 24 19:07:09.191685 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.191662 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:07:09.208371 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:09.208335 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" podStartSLOduration=9.837499693 podStartE2EDuration="27.208321589s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.563380757 +0000 UTC m=+3.168672154" lastFinishedPulling="2026-04-24 19:07:01.934202653 +0000 UTC m=+20.539494050" observedRunningTime="2026-04-24 19:07:09.206841244 +0000 UTC m=+27.812132659" watchObservedRunningTime="2026-04-24 19:07:09.208321589 +0000 UTC m=+27.813613005" Apr 24 19:07:10.181846 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.181764 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a4ef86a-6412-43fa-ba15-979962cfdfad" 
containerID="bd6cb2f4d3bf1ff456a26d123c81cc1de3f07e597477e1f8f4fa6fd0c31e6975" exitCode=0 Apr 24 19:07:10.182292 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.181847 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerDied","Data":"bd6cb2f4d3bf1ff456a26d123c81cc1de3f07e597477e1f8f4fa6fd0c31e6975"} Apr 24 19:07:10.815383 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.815347 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lclhm"] Apr 24 19:07:10.818393 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.818372 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.822084 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.822060 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fpmcl\"" Apr 24 19:07:10.822502 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.822484 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 19:07:10.822663 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.822649 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 19:07:10.882968 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.882930 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-tmp-dir\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.883143 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.882985 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-hosts-file\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.883143 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.883028 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv6sc\" (UniqueName: \"kubernetes.io/projected/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-kube-api-access-kv6sc\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.983856 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.983824 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-tmp-dir\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.984030 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.983870 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-hosts-file\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.984030 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.983909 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv6sc\" (UniqueName: \"kubernetes.io/projected/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-kube-api-access-kv6sc\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.984185 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.984161 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-tmp-dir\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.984277 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.984202 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-hosts-file\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:10.996414 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:10.996389 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv6sc\" (UniqueName: \"kubernetes.io/projected/23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1-kube-api-access-kv6sc\") pod \"node-resolver-lclhm\" (UID: \"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1\") " pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:11.014321 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:11.014284 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:07:11.014471 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:11.014290 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:07:11.014471 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:11.014415 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb" Apr 24 19:07:11.014551 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:11.014520 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4" Apr 24 19:07:11.129514 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:11.129434 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lclhm" Apr 24 19:07:11.186994 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:11.186720 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lclhm" event={"ID":"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1","Type":"ContainerStarted","Data":"a674dc6cb64f6b6834b63c3c43e5ceea950b64f9d34b4f03338aa86ae650138b"} Apr 24 19:07:12.188741 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:12.188703 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lclhm" event={"ID":"23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1","Type":"ContainerStarted","Data":"2d1075c83c813b5da70e63769012b7ec30e56274493a46b663094eefbee2c3cc"} Apr 24 19:07:12.205781 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:12.205722 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lclhm" podStartSLOduration=2.20570462 podStartE2EDuration="2.20570462s" podCreationTimestamp="2026-04-24 19:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:12.205451508 +0000 UTC m=+30.810742924" watchObservedRunningTime="2026-04-24 19:07:12.20570462 +0000 UTC 
m=+30.810996037" Apr 24 19:07:13.013554 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:13.013519 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:07:13.013705 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:13.013563 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:07:13.013705 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:13.013644 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6v84" podUID="12b20576-da14-4ba1-926b-fed787f86bfb" Apr 24 19:07:13.013805 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:13.013733 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f6x9g" podUID="c0ea34e5-a89a-4142-83d4-e94ef986bfa4" Apr 24 19:07:14.744091 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.744055 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-23.ec2.internal" event="NodeReady" Apr 24 19:07:14.744451 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.744208 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 19:07:14.790186 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.790149 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zl4tq"] Apr 24 19:07:14.795183 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.795153 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:14.797328 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.797305 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p67rd"] Apr 24 19:07:14.798015 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.797993 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 19:07:14.798085 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.798063 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 19:07:14.798352 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.798335 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rh56f\"" Apr 24 19:07:14.800825 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.800800 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:14.803377 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.803349 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zl4tq"] Apr 24 19:07:14.803833 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.803810 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 19:07:14.804200 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.804024 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 19:07:14.804200 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.804142 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8pmm\"" Apr 24 19:07:14.805361 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.805015 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 19:07:14.815766 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.814708 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p67rd"] Apr 24 19:07:14.910932 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.910840 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hx9v\" (UniqueName: \"kubernetes.io/projected/e08e9f5e-5277-4567-a799-97a88665243a-kube-api-access-9hx9v\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:14.911172 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.910939 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:14.911172 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.910962 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c680e87e-93f7-42bb-8645-95c2ba5b415e-config-volume\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:14.911172 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.910990 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gv8r\" (UniqueName: \"kubernetes.io/projected/c680e87e-93f7-42bb-8645-95c2ba5b415e-kube-api-access-7gv8r\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:14.911172 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.911017 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:14.911172 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:14.911100 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c680e87e-93f7-42bb-8645-95c2ba5b415e-tmp-dir\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.011523 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.011475 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7gv8r\" (UniqueName: \"kubernetes.io/projected/c680e87e-93f7-42bb-8645-95c2ba5b415e-kube-api-access-7gv8r\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.011523 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.011526 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:15.011758 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.011560 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c680e87e-93f7-42bb-8645-95c2ba5b415e-tmp-dir\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.011758 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.011591 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hx9v\" (UniqueName: \"kubernetes.io/projected/e08e9f5e-5277-4567-a799-97a88665243a-kube-api-access-9hx9v\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:15.011758 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.011651 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.011758 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.011680 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c680e87e-93f7-42bb-8645-95c2ba5b415e-config-volume\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.011758 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.011680 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:15.011997 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.011765 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert podName:e08e9f5e-5277-4567-a799-97a88665243a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:15.51174345 +0000 UTC m=+34.117034847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert") pod "ingress-canary-p67rd" (UID: "e08e9f5e-5277-4567-a799-97a88665243a") : secret "canary-serving-cert" not found Apr 24 19:07:15.011997 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.011789 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:15.011997 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.011863 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls podName:c680e87e-93f7-42bb-8645-95c2ba5b415e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:15.511844642 +0000 UTC m=+34.117136052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls") pod "dns-default-zl4tq" (UID: "c680e87e-93f7-42bb-8645-95c2ba5b415e") : secret "dns-default-metrics-tls" not found Apr 24 19:07:15.011997 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.011965 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c680e87e-93f7-42bb-8645-95c2ba5b415e-tmp-dir\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.012299 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.012281 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c680e87e-93f7-42bb-8645-95c2ba5b415e-config-volume\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.014698 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.014499 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:07:15.015229 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.015065 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:07:15.017589 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.017564 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 19:07:15.017807 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.017576 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h4gdd\"" Apr 24 19:07:15.017912 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.017830 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 19:07:15.017912 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.017890 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 19:07:15.018024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.017834 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjm4v\"" Apr 24 19:07:15.025003 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.024965 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gv8r\" (UniqueName: \"kubernetes.io/projected/c680e87e-93f7-42bb-8645-95c2ba5b415e-kube-api-access-7gv8r\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:15.025101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.025050 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hx9v\" (UniqueName: \"kubernetes.io/projected/e08e9f5e-5277-4567-a799-97a88665243a-kube-api-access-9hx9v\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 
19:07:15.514491 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.514456 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd"
Apr 24 19:07:15.514670 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.514535 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq"
Apr 24 19:07:15.514670 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.514619 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:15.514670 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.514619 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:15.514810 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.514688 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls podName:c680e87e-93f7-42bb-8645-95c2ba5b415e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:16.514669092 +0000 UTC m=+35.119960487 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls") pod "dns-default-zl4tq" (UID: "c680e87e-93f7-42bb-8645-95c2ba5b415e") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:15.514810 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.514702 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert podName:e08e9f5e-5277-4567-a799-97a88665243a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:16.514695424 +0000 UTC m=+35.119986818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert") pod "ingress-canary-p67rd" (UID: "e08e9f5e-5277-4567-a799-97a88665243a") : secret "canary-serving-cert" not found
Apr 24 19:07:15.615625 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.615583 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g"
Apr 24 19:07:15.615804 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.615737 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 19:07:15.615876 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:15.615809 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs podName:c0ea34e5-a89a-4142-83d4-e94ef986bfa4 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:47.615788318 +0000 UTC m=+66.221079725 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs") pod "network-metrics-daemon-f6x9g" (UID: "c0ea34e5-a89a-4142-83d4-e94ef986bfa4") : secret "metrics-daemon-secret" not found
Apr 24 19:07:15.716904 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.716866 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:15.719579 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.719557 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45x4\" (UniqueName: \"kubernetes.io/projected/12b20576-da14-4ba1-926b-fed787f86bfb-kube-api-access-c45x4\") pod \"network-check-target-n6v84\" (UID: \"12b20576-da14-4ba1-926b-fed787f86bfb\") " pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:15.938532 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:15.938456 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:16.276377 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:16.276349 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n6v84"]
Apr 24 19:07:16.279649 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:16.279621 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b20576_da14_4ba1_926b_fed787f86bfb.slice/crio-d8d1409fa248a47e44bfb5ddc1d3c5fa2af57220c4598df660dd21923687029c WatchSource:0}: Error finding container d8d1409fa248a47e44bfb5ddc1d3c5fa2af57220c4598df660dd21923687029c: Status 404 returned error can't find the container with id d8d1409fa248a47e44bfb5ddc1d3c5fa2af57220c4598df660dd21923687029c
Apr 24 19:07:16.525295 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:16.525051 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq"
Apr 24 19:07:16.525452 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:16.525336 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd"
Apr 24 19:07:16.525452 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:16.525198 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:16.525452 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:16.525420 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls podName:c680e87e-93f7-42bb-8645-95c2ba5b415e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.525400485 +0000 UTC m=+37.130691896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls") pod "dns-default-zl4tq" (UID: "c680e87e-93f7-42bb-8645-95c2ba5b415e") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:16.525559 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:16.525472 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:16.525559 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:16.525513 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert podName:e08e9f5e-5277-4567-a799-97a88665243a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.525501652 +0000 UTC m=+37.130793051 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert") pod "ingress-canary-p67rd" (UID: "e08e9f5e-5277-4567-a799-97a88665243a") : secret "canary-serving-cert" not found
Apr 24 19:07:17.204332 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:17.204275 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n6v84" event={"ID":"12b20576-da14-4ba1-926b-fed787f86bfb","Type":"ContainerStarted","Data":"d8d1409fa248a47e44bfb5ddc1d3c5fa2af57220c4598df660dd21923687029c"}
Apr 24 19:07:17.207136 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:17.207104 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a4ef86a-6412-43fa-ba15-979962cfdfad" containerID="9b01f65909e475cd99143a3d8913277c9dcce60e711ad553742984b9d9ebaa61" exitCode=0
Apr 24 19:07:17.207268 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:17.207160 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerDied","Data":"9b01f65909e475cd99143a3d8913277c9dcce60e711ad553742984b9d9ebaa61"}
Apr 24 19:07:18.214579 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:18.214542 2583 generic.go:358] "Generic (PLEG): container finished" podID="2a4ef86a-6412-43fa-ba15-979962cfdfad" containerID="c86fa7de70490081d0c199fb7163e5ff2b92e4d51ca6490834436a0153aa5ba6" exitCode=0
Apr 24 19:07:18.215022 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:18.214600 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerDied","Data":"c86fa7de70490081d0c199fb7163e5ff2b92e4d51ca6490834436a0153aa5ba6"}
Apr 24 19:07:18.542653 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:18.542607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq"
Apr 24 19:07:18.542853 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:18.542678 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd"
Apr 24 19:07:18.542853 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:18.542827 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:18.542977 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:18.542892 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert podName:e08e9f5e-5277-4567-a799-97a88665243a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:22.542872741 +0000 UTC m=+41.148164135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert") pod "ingress-canary-p67rd" (UID: "e08e9f5e-5277-4567-a799-97a88665243a") : secret "canary-serving-cert" not found
Apr 24 19:07:18.543368 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:18.543341 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:18.543476 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:18.543399 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls podName:c680e87e-93f7-42bb-8645-95c2ba5b415e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:22.54338445 +0000 UTC m=+41.148675844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls") pod "dns-default-zl4tq" (UID: "c680e87e-93f7-42bb-8645-95c2ba5b415e") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:20.222732 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:20.222692 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n6v84" event={"ID":"12b20576-da14-4ba1-926b-fed787f86bfb","Type":"ContainerStarted","Data":"e4320ac3968b738d96f58b1cc0c3a99d907a2cc8d99f95ce26a7113797d954eb"}
Apr 24 19:07:20.223148 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:20.222872 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n6v84"
Apr 24 19:07:20.225750 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:20.225726 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" event={"ID":"2a4ef86a-6412-43fa-ba15-979962cfdfad","Type":"ContainerStarted","Data":"06f758bd275b2b211affe69e7b1dbe8d302941f908022a655530210488366304"}
Apr 24 19:07:20.243656 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:20.243588 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n6v84" podStartSLOduration=35.119502524 podStartE2EDuration="38.243573832s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:07:16.28207973 +0000 UTC m=+34.887371124" lastFinishedPulling="2026-04-24 19:07:19.406151034 +0000 UTC m=+38.011442432" observedRunningTime="2026-04-24 19:07:20.243377562 +0000 UTC m=+38.848668981" watchObservedRunningTime="2026-04-24 19:07:20.243573832 +0000 UTC m=+38.848865296"
Apr 24 19:07:20.274787 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:20.274734 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qdfd7" podStartSLOduration=6.701171274 podStartE2EDuration="38.274720922s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:06:44.568197655 +0000 UTC m=+3.173489052" lastFinishedPulling="2026-04-24 19:07:16.141747307 +0000 UTC m=+34.747038700" observedRunningTime="2026-04-24 19:07:20.270600784 +0000 UTC m=+38.875892202" watchObservedRunningTime="2026-04-24 19:07:20.274720922 +0000 UTC m=+38.880012337"
Apr 24 19:07:22.569223 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:22.569181 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd"
Apr 24 19:07:22.569624 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:22.569244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq"
Apr 24 19:07:22.569624 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:22.569353 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:22.569624 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:22.569412 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls podName:c680e87e-93f7-42bb-8645-95c2ba5b415e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:30.569396203 +0000 UTC m=+49.174687597 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls") pod "dns-default-zl4tq" (UID: "c680e87e-93f7-42bb-8645-95c2ba5b415e") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:22.569624 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:22.569350 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:22.569624 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:22.569500 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert podName:e08e9f5e-5277-4567-a799-97a88665243a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:30.569485854 +0000 UTC m=+49.174777251 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert") pod "ingress-canary-p67rd" (UID: "e08e9f5e-5277-4567-a799-97a88665243a") : secret "canary-serving-cert" not found
Apr 24 19:07:23.160796 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.160762 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"]
Apr 24 19:07:23.164805 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.164788 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"
Apr 24 19:07:23.169088 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.169066 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 19:07:23.169693 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.169673 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qsf2h\""
Apr 24 19:07:23.169693 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.169688 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 19:07:23.176662 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.176636 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"]
Apr 24 19:07:23.273732 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.273687 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7zrm\" (UniqueName: \"kubernetes.io/projected/21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb-kube-api-access-c7zrm\") pod \"migrator-74bb7799d9-rlrgb\" (UID: \"21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"
Apr 24 19:07:23.374334 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.374299 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7zrm\" (UniqueName: \"kubernetes.io/projected/21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb-kube-api-access-c7zrm\") pod \"migrator-74bb7799d9-rlrgb\" (UID: \"21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"
Apr 24 19:07:23.385084 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.385050 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7zrm\" (UniqueName: \"kubernetes.io/projected/21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb-kube-api-access-c7zrm\") pod \"migrator-74bb7799d9-rlrgb\" (UID: \"21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"
Apr 24 19:07:23.473230 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.473151 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"
Apr 24 19:07:23.612391 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:23.612351 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb"]
Apr 24 19:07:23.616124 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:23.616086 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21f31c0d_2e5a_4ef1_bb98_2dbf4e3074cb.slice/crio-0b4530883af26119d62ff0ea7e57a53ab9988a687e9cac1771108f494a14f488 WatchSource:0}: Error finding container 0b4530883af26119d62ff0ea7e57a53ab9988a687e9cac1771108f494a14f488: Status 404 returned error can't find the container with id 0b4530883af26119d62ff0ea7e57a53ab9988a687e9cac1771108f494a14f488
Apr 24 19:07:24.233784 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.233752 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb" event={"ID":"21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb","Type":"ContainerStarted","Data":"0b4530883af26119d62ff0ea7e57a53ab9988a687e9cac1771108f494a14f488"}
Apr 24 19:07:24.592285 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.592243 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"]
Apr 24 19:07:24.596039 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.596019 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.599797 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.599774 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 19:07:24.599986 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.599967 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 19:07:24.600117 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.600100 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 19:07:24.601172 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.601148 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 19:07:24.607381 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.607356 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"]
Apr 24 19:07:24.685061 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.685024 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-klusterlet-config\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.685470 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.685109 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-tmp\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.685470 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.685150 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nm58\" (UniqueName: \"kubernetes.io/projected/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-kube-api-access-2nm58\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.694272 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.694225 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gdkhw"]
Apr 24 19:07:24.697675 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.697623 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.701541 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.701519 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-vphpr\""
Apr 24 19:07:24.701675 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.701521 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 19:07:24.701802 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.701788 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 19:07:24.701997 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.701984 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 19:07:24.702657 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.702639 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 19:07:24.711966 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.711943 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gdkhw"]
Apr 24 19:07:24.759800 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.759773 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lclhm_23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1/dns-node-resolver/0.log"
Apr 24 19:07:24.786098 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.786068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-klusterlet-config\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.786239 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.786105 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-signing-cabundle\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.786239 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.786129 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-signing-key\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.786239 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.786172 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-tmp\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.786239 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.786218 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77rw\" (UniqueName: \"kubernetes.io/projected/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-kube-api-access-g77rw\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.786441 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.786290 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nm58\" (UniqueName: \"kubernetes.io/projected/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-kube-api-access-2nm58\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.786486 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.786469 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-tmp\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.788703 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.788674 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-klusterlet-config\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.809118 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.809087 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nm58\" (UniqueName: \"kubernetes.io/projected/58c94ac9-ee49-4ff1-b593-60cf8fc715e9-kube-api-access-2nm58\") pod \"klusterlet-addon-workmgr-69b9c7976f-lcg4r\" (UID: \"58c94ac9-ee49-4ff1-b593-60cf8fc715e9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.855842 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.855806 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85f79f764-d9m27"]
Apr 24 19:07:24.858684 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.858669 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.861321 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.861296 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 19:07:24.861455 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.861296 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 19:07:24.861455 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.861356 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 19:07:24.861626 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.861606 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-czn8s\""
Apr 24 19:07:24.868041 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.868004 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 19:07:24.869697 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.869675 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85f79f764-d9m27"]
Apr 24 19:07:24.887533 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.887507 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-signing-cabundle\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.887660 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.887551 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-signing-key\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.887660 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.887610 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g77rw\" (UniqueName: \"kubernetes.io/projected/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-kube-api-access-g77rw\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.888168 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.888146 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-signing-cabundle\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.890063 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.890044 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-signing-key\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.901790 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.901766 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77rw\" (UniqueName: \"kubernetes.io/projected/a54f3e09-4702-4ddf-a7ff-29fce62b2b04-kube-api-access-g77rw\") pod \"service-ca-865cb79987-gdkhw\" (UID: \"a54f3e09-4702-4ddf-a7ff-29fce62b2b04\") " pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:24.907585 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.907570 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"
Apr 24 19:07:24.988660 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988629 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-certificates\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.988885 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988671 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-image-registry-private-configuration\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.988885 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988715 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-installation-pull-secrets\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.988885 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988739 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.988885 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988754 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-bound-sa-token\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.988885 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988824 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-trusted-ca\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.988885 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988863 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ed18fab9-300d-43bb-9afb-6e4c9b597e56-ca-trust-extracted\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:24.989206 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:24.988937 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntmn\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-kube-api-access-nntmn\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:25.006698 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.006663 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gdkhw"
Apr 24 19:07:25.025430 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.025394 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r"]
Apr 24 19:07:25.028385 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:25.028356 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c94ac9_ee49_4ff1_b593_60cf8fc715e9.slice/crio-5187908fb4f22e82bb433968ad06082c80e1f3230f8bf4842a8e35ae1cb3d442 WatchSource:0}: Error finding container 5187908fb4f22e82bb433968ad06082c80e1f3230f8bf4842a8e35ae1cb3d442: Status 404 returned error can't find the container with id 5187908fb4f22e82bb433968ad06082c80e1f3230f8bf4842a8e35ae1cb3d442
Apr 24 19:07:25.089758 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.089722 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nntmn\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-kube-api-access-nntmn\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:25.089758 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.089760 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-certificates\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:07:25.089991 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.089783 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName:
\"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-image-registry-private-configuration\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.089991 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.089927 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-installation-pull-secrets\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.089991 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.089962 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.089991 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.089988 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-bound-sa-token\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.090210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.090047 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-trusted-ca\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.090210 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:07:25.090084 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ed18fab9-300d-43bb-9afb-6e4c9b597e56-ca-trust-extracted\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.090430 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:25.090402 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:25.090430 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.090426 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ed18fab9-300d-43bb-9afb-6e4c9b597e56-ca-trust-extracted\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.090611 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:25.090430 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85f79f764-d9m27: secret "image-registry-tls" not found Apr 24 19:07:25.090611 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.090457 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-certificates\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.090611 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:25.090522 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls podName:ed18fab9-300d-43bb-9afb-6e4c9b597e56 nodeName:}" 
failed. No retries permitted until 2026-04-24 19:07:25.590502291 +0000 UTC m=+44.195793692 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls") pod "image-registry-85f79f764-d9m27" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56") : secret "image-registry-tls" not found Apr 24 19:07:25.091091 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.091073 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-trusted-ca\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.092609 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.092577 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-installation-pull-secrets\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.092809 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.092791 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-image-registry-private-configuration\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.100548 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.100519 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-bound-sa-token\") pod 
\"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.100840 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.100815 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntmn\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-kube-api-access-nntmn\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.125849 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.125818 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gdkhw"] Apr 24 19:07:25.129554 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:25.129516 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54f3e09_4702_4ddf_a7ff_29fce62b2b04.slice/crio-920e4a14226b3e257f292e9bfb61377c43e49b2c5320378badff920889a87490 WatchSource:0}: Error finding container 920e4a14226b3e257f292e9bfb61377c43e49b2c5320378badff920889a87490: Status 404 returned error can't find the container with id 920e4a14226b3e257f292e9bfb61377c43e49b2c5320378badff920889a87490 Apr 24 19:07:25.237301 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.237237 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb" event={"ID":"21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb","Type":"ContainerStarted","Data":"49d4aab200bb119802f2614e4550032b8ae339449873454ca53bdec484c29b06"} Apr 24 19:07:25.237301 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.237299 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb" 
event={"ID":"21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb","Type":"ContainerStarted","Data":"5e7e7eed550467fd1cc657b11c91e33300e42286b0632cc9e8e02fa4e2d87756"} Apr 24 19:07:25.238111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.238083 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r" event={"ID":"58c94ac9-ee49-4ff1-b593-60cf8fc715e9","Type":"ContainerStarted","Data":"5187908fb4f22e82bb433968ad06082c80e1f3230f8bf4842a8e35ae1cb3d442"} Apr 24 19:07:25.238980 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.238955 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gdkhw" event={"ID":"a54f3e09-4702-4ddf-a7ff-29fce62b2b04","Type":"ContainerStarted","Data":"920e4a14226b3e257f292e9bfb61377c43e49b2c5320378badff920889a87490"} Apr 24 19:07:25.258695 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.258643 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rlrgb" podStartSLOduration=1.364736965 podStartE2EDuration="2.258627024s" podCreationTimestamp="2026-04-24 19:07:23 +0000 UTC" firstStartedPulling="2026-04-24 19:07:23.618280862 +0000 UTC m=+42.223572257" lastFinishedPulling="2026-04-24 19:07:24.512170923 +0000 UTC m=+43.117462316" observedRunningTime="2026-04-24 19:07:25.257553208 +0000 UTC m=+43.862844623" watchObservedRunningTime="2026-04-24 19:07:25.258627024 +0000 UTC m=+43.863918441" Apr 24 19:07:25.594926 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.594879 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:25.595166 ip-10-0-137-23 
kubenswrapper[2583]: E0424 19:07:25.595044 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:25.595166 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:25.595068 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85f79f764-d9m27: secret "image-registry-tls" not found Apr 24 19:07:25.595166 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:25.595137 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls podName:ed18fab9-300d-43bb-9afb-6e4c9b597e56 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.595111155 +0000 UTC m=+45.200402552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls") pod "image-registry-85f79f764-d9m27" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56") : secret "image-registry-tls" not found Apr 24 19:07:25.957998 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:25.957803 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mm5fz_c359622a-4d36-4dcb-b06d-e8b0a4c453ad/node-ca/0.log" Apr 24 19:07:26.603873 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:26.603831 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:26.604031 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:26.604014 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:26.604072 ip-10-0-137-23 
kubenswrapper[2583]: E0424 19:07:26.604036 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85f79f764-d9m27: secret "image-registry-tls" not found Apr 24 19:07:26.604128 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:26.604106 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls podName:ed18fab9-300d-43bb-9afb-6e4c9b597e56 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:28.604085107 +0000 UTC m=+47.209376510 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls") pod "image-registry-85f79f764-d9m27" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56") : secret "image-registry-tls" not found Apr 24 19:07:28.619319 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:28.619245 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:28.619801 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:28.619390 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:28.619801 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:28.619416 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85f79f764-d9m27: secret "image-registry-tls" not found Apr 24 19:07:28.619801 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:28.619483 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls 
podName:ed18fab9-300d-43bb-9afb-6e4c9b597e56 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:32.619461077 +0000 UTC m=+51.224752481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls") pod "image-registry-85f79f764-d9m27" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56") : secret "image-registry-tls" not found Apr 24 19:07:30.251791 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:30.251749 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gdkhw" event={"ID":"a54f3e09-4702-4ddf-a7ff-29fce62b2b04","Type":"ContainerStarted","Data":"4c93272d2761f37f21a3eaa18d1b3cf662e250273a0e907b029d6d7fc9bf2f65"} Apr 24 19:07:30.254080 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:30.254053 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r" event={"ID":"58c94ac9-ee49-4ff1-b593-60cf8fc715e9","Type":"ContainerStarted","Data":"6053d86bb289426f98a35a33da25792cfd9601f9b417bb226aaf652ab41a9871"} Apr 24 19:07:30.254867 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:30.254518 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r" Apr 24 19:07:30.256285 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:30.256243 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r" Apr 24 19:07:30.284540 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:30.284487 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-gdkhw" podStartSLOduration=2.172717849 podStartE2EDuration="6.284472645s" podCreationTimestamp="2026-04-24 19:07:24 +0000 UTC" 
firstStartedPulling="2026-04-24 19:07:25.131412677 +0000 UTC m=+43.736704071" lastFinishedPulling="2026-04-24 19:07:29.243167467 +0000 UTC m=+47.848458867" observedRunningTime="2026-04-24 19:07:30.281614863 +0000 UTC m=+48.886906283" watchObservedRunningTime="2026-04-24 19:07:30.284472645 +0000 UTC m=+48.889764061" Apr 24 19:07:30.635784 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:30.635741 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:30.635989 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:30.635831 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:30.635989 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:30.635917 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:30.635989 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:30.635964 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:30.636138 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:30.635991 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert podName:e08e9f5e-5277-4567-a799-97a88665243a nodeName:}" failed. No retries permitted until 2026-04-24 19:07:46.635970422 +0000 UTC m=+65.241261820 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert") pod "ingress-canary-p67rd" (UID: "e08e9f5e-5277-4567-a799-97a88665243a") : secret "canary-serving-cert" not found Apr 24 19:07:30.636138 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:30.636012 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls podName:c680e87e-93f7-42bb-8645-95c2ba5b415e nodeName:}" failed. No retries permitted until 2026-04-24 19:07:46.635997686 +0000 UTC m=+65.241289085 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls") pod "dns-default-zl4tq" (UID: "c680e87e-93f7-42bb-8645-95c2ba5b415e") : secret "dns-default-metrics-tls" not found Apr 24 19:07:32.652297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:32.652235 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:32.652728 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:32.652387 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:32.652728 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:32.652408 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85f79f764-d9m27: secret "image-registry-tls" not found Apr 24 19:07:32.652728 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:32.652467 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls podName:ed18fab9-300d-43bb-9afb-6e4c9b597e56 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:40.65244992 +0000 UTC m=+59.257741318 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls") pod "image-registry-85f79f764-d9m27" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56") : secret "image-registry-tls" not found Apr 24 19:07:40.716207 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:40.716165 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:40.718757 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:40.718732 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"image-registry-85f79f764-d9m27\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") " pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:40.767865 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:40.767826 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:40.898596 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:40.898541 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69b9c7976f-lcg4r" podStartSLOduration=12.668752109 podStartE2EDuration="16.89852461s" podCreationTimestamp="2026-04-24 19:07:24 +0000 UTC" firstStartedPulling="2026-04-24 19:07:25.030146653 +0000 UTC m=+43.635438058" lastFinishedPulling="2026-04-24 19:07:29.25991916 +0000 UTC m=+47.865210559" observedRunningTime="2026-04-24 19:07:30.315692168 +0000 UTC m=+48.920983583" watchObservedRunningTime="2026-04-24 19:07:40.89852461 +0000 UTC m=+59.503816026" Apr 24 19:07:40.899314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:40.899294 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85f79f764-d9m27"] Apr 24 19:07:40.902361 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:40.902333 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded18fab9_300d_43bb_9afb_6e4c9b597e56.slice/crio-6c4ed1f6b307524aa279a79b77a0b01995d337a69dc17dc3d6911eeae783ec3c WatchSource:0}: Error finding container 6c4ed1f6b307524aa279a79b77a0b01995d337a69dc17dc3d6911eeae783ec3c: Status 404 returned error can't find the container with id 6c4ed1f6b307524aa279a79b77a0b01995d337a69dc17dc3d6911eeae783ec3c Apr 24 19:07:41.198289 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:41.198237 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b2ftq" Apr 24 19:07:41.279125 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:41.279079 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85f79f764-d9m27" 
event={"ID":"ed18fab9-300d-43bb-9afb-6e4c9b597e56","Type":"ContainerStarted","Data":"4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80"} Apr 24 19:07:41.279125 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:41.279120 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85f79f764-d9m27" event={"ID":"ed18fab9-300d-43bb-9afb-6e4c9b597e56","Type":"ContainerStarted","Data":"6c4ed1f6b307524aa279a79b77a0b01995d337a69dc17dc3d6911eeae783ec3c"} Apr 24 19:07:41.279392 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:41.279215 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-85f79f764-d9m27" Apr 24 19:07:41.303318 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:41.303241 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-85f79f764-d9m27" podStartSLOduration=17.30322694 podStartE2EDuration="17.30322694s" podCreationTimestamp="2026-04-24 19:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:41.302334064 +0000 UTC m=+59.907625484" watchObservedRunningTime="2026-04-24 19:07:41.30322694 +0000 UTC m=+59.908518396" Apr 24 19:07:46.664380 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.664338 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:46.664945 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.664395 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: 
\"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:46.667020 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.666982 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c680e87e-93f7-42bb-8645-95c2ba5b415e-metrics-tls\") pod \"dns-default-zl4tq\" (UID: \"c680e87e-93f7-42bb-8645-95c2ba5b415e\") " pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:46.667129 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.667035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e08e9f5e-5277-4567-a799-97a88665243a-cert\") pod \"ingress-canary-p67rd\" (UID: \"e08e9f5e-5277-4567-a799-97a88665243a\") " pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:46.911214 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.911183 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rh56f\"" Apr 24 19:07:46.919503 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.919425 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zl4tq" Apr 24 19:07:46.920738 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.920717 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8pmm\"" Apr 24 19:07:46.928896 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:46.928867 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p67rd" Apr 24 19:07:47.059034 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.059004 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zl4tq"] Apr 24 19:07:47.062108 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:47.062079 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc680e87e_93f7_42bb_8645_95c2ba5b415e.slice/crio-fee049fd675132292abc106ca3f8aa143f4621355aae2ff10eac7f6b07296a24 WatchSource:0}: Error finding container fee049fd675132292abc106ca3f8aa143f4621355aae2ff10eac7f6b07296a24: Status 404 returned error can't find the container with id fee049fd675132292abc106ca3f8aa143f4621355aae2ff10eac7f6b07296a24 Apr 24 19:07:47.080887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.080858 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p67rd"] Apr 24 19:07:47.083640 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:47.083617 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode08e9f5e_5277_4567_a799_97a88665243a.slice/crio-246ba2e71a35cff47909e1e783e65362586c3a34c0f4105261d63c025b501998 WatchSource:0}: Error finding container 246ba2e71a35cff47909e1e783e65362586c3a34c0f4105261d63c025b501998: Status 404 returned error can't find the container with id 246ba2e71a35cff47909e1e783e65362586c3a34c0f4105261d63c025b501998 Apr 24 19:07:47.295026 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.294984 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p67rd" event={"ID":"e08e9f5e-5277-4567-a799-97a88665243a","Type":"ContainerStarted","Data":"246ba2e71a35cff47909e1e783e65362586c3a34c0f4105261d63c025b501998"} Apr 24 19:07:47.296008 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.295986 2583 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zl4tq" event={"ID":"c680e87e-93f7-42bb-8645-95c2ba5b415e","Type":"ContainerStarted","Data":"fee049fd675132292abc106ca3f8aa143f4621355aae2ff10eac7f6b07296a24"} Apr 24 19:07:47.407345 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.407310 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85f79f764-d9m27"] Apr 24 19:07:47.490991 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.490957 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-887cb6fd5-fs5mk"] Apr 24 19:07:47.494409 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.494379 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.523501 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.523468 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-887cb6fd5-fs5mk"] Apr 24 19:07:47.537969 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.537934 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6vdfs"] Apr 24 19:07:47.541321 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.541293 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.544824 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.544792 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 19:07:47.544973 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.544860 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bjv5f\"" Apr 24 19:07:47.544973 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.544868 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 19:07:47.545087 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.544806 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 19:07:47.545087 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.544995 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 19:07:47.555652 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.555606 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6vdfs"] Apr 24 19:07:47.671693 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671652 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635da0a1-4f63-4344-b2a2-310bcbfbe50c-registry-certificates\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671700 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635da0a1-4f63-4344-b2a2-310bcbfbe50c-installation-pull-secrets\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671742 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966nx\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-kube-api-access-966nx\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671775 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-registry-tls\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671817 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod \"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671845 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635da0a1-4f63-4344-b2a2-310bcbfbe50c-trusted-ca\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " 
pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671876 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5839401-04b4-44b8-a2ca-b823d81b3ac6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671909 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635da0a1-4f63-4344-b2a2-310bcbfbe50c-ca-trust-extracted\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671934 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5839401-04b4-44b8-a2ca-b823d81b3ac6-data-volume\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.671976 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635da0a1-4f63-4344-b2a2-310bcbfbe50c-image-registry-private-configuration\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.672004 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvbl\" (UniqueName: \"kubernetes.io/projected/e5839401-04b4-44b8-a2ca-b823d81b3ac6-kube-api-access-lgvbl\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.672030 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5839401-04b4-44b8-a2ca-b823d81b3ac6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.672062 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5839401-04b4-44b8-a2ca-b823d81b3ac6-crio-socket\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.672111 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.672092 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-bound-sa-token\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.675400 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.675344 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ea34e5-a89a-4142-83d4-e94ef986bfa4-metrics-certs\") pod 
\"network-metrics-daemon-f6x9g\" (UID: \"c0ea34e5-a89a-4142-83d4-e94ef986bfa4\") " pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:07:47.735411 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.735378 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjm4v\"" Apr 24 19:07:47.743811 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.743776 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f6x9g" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772466 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-registry-tls\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772515 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635da0a1-4f63-4344-b2a2-310bcbfbe50c-trusted-ca\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772549 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5839401-04b4-44b8-a2ca-b823d81b3ac6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772594 2583 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635da0a1-4f63-4344-b2a2-310bcbfbe50c-ca-trust-extracted\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772620 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5839401-04b4-44b8-a2ca-b823d81b3ac6-data-volume\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772657 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635da0a1-4f63-4344-b2a2-310bcbfbe50c-image-registry-private-configuration\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772683 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvbl\" (UniqueName: \"kubernetes.io/projected/e5839401-04b4-44b8-a2ca-b823d81b3ac6-kube-api-access-lgvbl\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772708 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5839401-04b4-44b8-a2ca-b823d81b3ac6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " 
pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772739 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5839401-04b4-44b8-a2ca-b823d81b3ac6-crio-socket\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772768 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-bound-sa-token\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772811 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635da0a1-4f63-4344-b2a2-310bcbfbe50c-registry-certificates\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772837 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635da0a1-4f63-4344-b2a2-310bcbfbe50c-installation-pull-secrets\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.773367 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.772876 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-966nx\" (UniqueName: 
\"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-kube-api-access-966nx\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.774090 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.773973 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/635da0a1-4f63-4344-b2a2-310bcbfbe50c-trusted-ca\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.775054 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.774290 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5839401-04b4-44b8-a2ca-b823d81b3ac6-crio-socket\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.775054 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.774486 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5839401-04b4-44b8-a2ca-b823d81b3ac6-data-volume\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.775054 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.774732 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5839401-04b4-44b8-a2ca-b823d81b3ac6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.775054 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.774793 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/635da0a1-4f63-4344-b2a2-310bcbfbe50c-ca-trust-extracted\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.775788 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.775762 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/635da0a1-4f63-4344-b2a2-310bcbfbe50c-registry-certificates\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.776197 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.776173 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-registry-tls\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.776415 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.776392 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5839401-04b4-44b8-a2ca-b823d81b3ac6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.777632 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.777599 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/635da0a1-4f63-4344-b2a2-310bcbfbe50c-installation-pull-secrets\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: 
\"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.778497 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.778477 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/635da0a1-4f63-4344-b2a2-310bcbfbe50c-image-registry-private-configuration\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.788477 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.788413 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-966nx\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-kube-api-access-966nx\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.792638 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.792592 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/635da0a1-4f63-4344-b2a2-310bcbfbe50c-bound-sa-token\") pod \"image-registry-887cb6fd5-fs5mk\" (UID: \"635da0a1-4f63-4344-b2a2-310bcbfbe50c\") " pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.793100 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.793080 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvbl\" (UniqueName: \"kubernetes.io/projected/e5839401-04b4-44b8-a2ca-b823d81b3ac6-kube-api-access-lgvbl\") pod \"insights-runtime-extractor-6vdfs\" (UID: \"e5839401-04b4-44b8-a2ca-b823d81b3ac6\") " pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.806217 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.805770 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:47.853467 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.853270 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6vdfs" Apr 24 19:07:47.910674 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.910621 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f6x9g"] Apr 24 19:07:47.917845 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:47.917787 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ea34e5_a89a_4142_83d4_e94ef986bfa4.slice/crio-3f7cb14e1bae0a1aeeb855f9a255ccc24c6af85f27a968dbedadd4b9910a7fa7 WatchSource:0}: Error finding container 3f7cb14e1bae0a1aeeb855f9a255ccc24c6af85f27a968dbedadd4b9910a7fa7: Status 404 returned error can't find the container with id 3f7cb14e1bae0a1aeeb855f9a255ccc24c6af85f27a968dbedadd4b9910a7fa7 Apr 24 19:07:47.992314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:47.992268 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-887cb6fd5-fs5mk"] Apr 24 19:07:48.043667 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.043635 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6vdfs"] Apr 24 19:07:48.300958 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.300911 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6vdfs" event={"ID":"e5839401-04b4-44b8-a2ca-b823d81b3ac6","Type":"ContainerStarted","Data":"af58997c217fc24f910bdd44fed2ebf62fa9838d3aa50b002b5125859bd8cbcd"} Apr 24 19:07:48.300958 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.300961 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6vdfs" 
event={"ID":"e5839401-04b4-44b8-a2ca-b823d81b3ac6","Type":"ContainerStarted","Data":"bd4b9511a2cbc891379b6e22da09d6de1323376e64f668e7d6ddb009fa7b1781"} Apr 24 19:07:48.302325 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.302288 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6x9g" event={"ID":"c0ea34e5-a89a-4142-83d4-e94ef986bfa4","Type":"ContainerStarted","Data":"3f7cb14e1bae0a1aeeb855f9a255ccc24c6af85f27a968dbedadd4b9910a7fa7"} Apr 24 19:07:48.304193 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.304131 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" event={"ID":"635da0a1-4f63-4344-b2a2-310bcbfbe50c","Type":"ContainerStarted","Data":"8397fed4cbf7fc8b5044a146b359c5b82c8595ad7084a5df9a84968e1e4a98a9"} Apr 24 19:07:48.304447 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.304379 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" event={"ID":"635da0a1-4f63-4344-b2a2-310bcbfbe50c","Type":"ContainerStarted","Data":"cac84e1218b4753496492e5ac6dd865ce4b69a42182d78e32e442acdb5da48c4"} Apr 24 19:07:48.304447 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.304427 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" Apr 24 19:07:48.332679 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:48.332612 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk" podStartSLOduration=1.332590977 podStartE2EDuration="1.332590977s" podCreationTimestamp="2026-04-24 19:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:48.330641934 +0000 UTC m=+66.935933349" watchObservedRunningTime="2026-04-24 19:07:48.332590977 +0000 UTC 
m=+66.937882396" Apr 24 19:07:50.313400 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:50.313365 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p67rd" event={"ID":"e08e9f5e-5277-4567-a799-97a88665243a","Type":"ContainerStarted","Data":"b9eff771cbdec858322771db3b98b46642b3d6df050a27b685813236f7c78117"} Apr 24 19:07:50.314781 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:50.314755 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zl4tq" event={"ID":"c680e87e-93f7-42bb-8645-95c2ba5b415e","Type":"ContainerStarted","Data":"222093f4f337d47a49698d46a7b44748eb322acb5fbcfa2cdd87ff0a6732dcbb"} Apr 24 19:07:50.330856 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:50.330810 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p67rd" podStartSLOduration=33.533393148 podStartE2EDuration="36.330789529s" podCreationTimestamp="2026-04-24 19:07:14 +0000 UTC" firstStartedPulling="2026-04-24 19:07:47.085498936 +0000 UTC m=+65.690790329" lastFinishedPulling="2026-04-24 19:07:49.882895316 +0000 UTC m=+68.488186710" observedRunningTime="2026-04-24 19:07:50.330052921 +0000 UTC m=+68.935344338" watchObservedRunningTime="2026-04-24 19:07:50.330789529 +0000 UTC m=+68.936080929" Apr 24 19:07:51.230418 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:51.230384 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n6v84" Apr 24 19:07:51.319912 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:51.319857 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6vdfs" event={"ID":"e5839401-04b4-44b8-a2ca-b823d81b3ac6","Type":"ContainerStarted","Data":"1ccb5d20dc12c2dbaaadef7389941aa8d0bba854e6fa9f8e170ce3bac04d8985"} Apr 24 19:07:51.321609 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:51.321569 2583 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6x9g" event={"ID":"c0ea34e5-a89a-4142-83d4-e94ef986bfa4","Type":"ContainerStarted","Data":"7718e1ee1b1a621c6e35b6c3bbb50c1e877e399e66dec8c5d0978d3381652865"} Apr 24 19:07:51.321609 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:51.321604 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f6x9g" event={"ID":"c0ea34e5-a89a-4142-83d4-e94ef986bfa4","Type":"ContainerStarted","Data":"56ca76e461ce9b6a23ec400df1f042eab4010439646b84125674d275fb0ff450"} Apr 24 19:07:51.323316 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:51.323280 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zl4tq" event={"ID":"c680e87e-93f7-42bb-8645-95c2ba5b415e","Type":"ContainerStarted","Data":"9e470428c392e6eae3980ca85ea52b4e0235dd5a7045f6d0cec7c4f2f6da595e"} Apr 24 19:07:51.343104 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:51.343048 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f6x9g" podStartSLOduration=66.907445919 podStartE2EDuration="1m9.343031491s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:07:47.922110101 +0000 UTC m=+66.527401511" lastFinishedPulling="2026-04-24 19:07:50.357695685 +0000 UTC m=+68.962987083" observedRunningTime="2026-04-24 19:07:51.340743302 +0000 UTC m=+69.946034721" watchObservedRunningTime="2026-04-24 19:07:51.343031491 +0000 UTC m=+69.948322923" Apr 24 19:07:52.041510 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:52.041451 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zl4tq" podStartSLOduration=35.222312968 podStartE2EDuration="38.041430235s" podCreationTimestamp="2026-04-24 19:07:14 +0000 UTC" firstStartedPulling="2026-04-24 19:07:47.063780443 +0000 UTC m=+65.669071841" lastFinishedPulling="2026-04-24 19:07:49.882897713 +0000 
UTC m=+68.488189108" observedRunningTime="2026-04-24 19:07:51.375509123 +0000 UTC m=+69.980800540" watchObservedRunningTime="2026-04-24 19:07:52.041430235 +0000 UTC m=+70.646721654"
Apr 24 19:07:52.326508 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:52.326415 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zl4tq"
Apr 24 19:07:53.331185 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:53.331148 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6vdfs" event={"ID":"e5839401-04b4-44b8-a2ca-b823d81b3ac6","Type":"ContainerStarted","Data":"6b8a26feac8c1e57d871323ccf52037b51e5dd2dcb60f6e4de3f0db122171ee1"}
Apr 24 19:07:53.353913 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:53.353860 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6vdfs" podStartSLOduration=1.894210326 podStartE2EDuration="6.353844392s" podCreationTimestamp="2026-04-24 19:07:47 +0000 UTC" firstStartedPulling="2026-04-24 19:07:48.132378298 +0000 UTC m=+66.737669717" lastFinishedPulling="2026-04-24 19:07:52.592012389 +0000 UTC m=+71.197303783" observedRunningTime="2026-04-24 19:07:53.351827294 +0000 UTC m=+71.957118722" watchObservedRunningTime="2026-04-24 19:07:53.353844392 +0000 UTC m=+71.959135808"
Apr 24 19:07:57.274202 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.274155 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"]
Apr 24 19:07:57.278181 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.278158 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.281699 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.281070 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 19:07:57.281699 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.281507 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-8mw4l\""
Apr 24 19:07:57.283070 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.282435 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 19:07:57.283070 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.282732 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 19:07:57.283070 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.282934 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 19:07:57.284084 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.283934 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 19:07:57.294039 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.294019 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 19:07:57.296762 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.296742 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"]
Apr 24 19:07:57.303681 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.303659 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cmsk6"]
Apr 24 19:07:57.306625 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.306606 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.309226 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.309179 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 19:07:57.309345 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.309226 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2nlmz\""
Apr 24 19:07:57.309410 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.309375 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 19:07:57.309462 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.309410 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 19:07:57.342016 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.341979 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c2485ee1-2466-4a88-b136-103a66abef2f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.342210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342031 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-root\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342116 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-tls\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342144 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.342210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342170 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnmp\" (UniqueName: \"kubernetes.io/projected/c2485ee1-2466-4a88-b136-103a66abef2f-kube-api-access-9cnmp\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.342210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342197 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-sys\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2485ee1-2466-4a88-b136-103a66abef2f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342303 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-wtmp\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342327 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342354 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-textfile\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342395 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342466 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342521 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfs2\" (UniqueName: \"kubernetes.io/projected/efda0d48-2fab-4250-bd44-8d6f6bc536e2-kube-api-access-ljfs2\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.342578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.342550 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efda0d48-2fab-4250-bd44-8d6f6bc536e2-metrics-client-ca\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.414052 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.414009 2583 patch_prober.go:28] interesting pod/image-registry-85f79f764-d9m27 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 19:07:57.414227 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.414064 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-85f79f764-d9m27" podUID="ed18fab9-300d-43bb-9afb-6e4c9b597e56" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 19:07:57.443321 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443287 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2485ee1-2466-4a88-b136-103a66abef2f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.443321 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443333 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-wtmp\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443567 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443360 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.443567 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443383 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-textfile\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443667 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443577 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-wtmp\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443667 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443632 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.443767 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443672 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443767 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443704 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfs2\" (UniqueName: \"kubernetes.io/projected/efda0d48-2fab-4250-bd44-8d6f6bc536e2-kube-api-access-ljfs2\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443767 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443740 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efda0d48-2fab-4250-bd44-8d6f6bc536e2-metrics-client-ca\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443767 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443748 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-textfile\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443961 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443829 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c2485ee1-2466-4a88-b136-103a66abef2f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.443961 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443873 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443961 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443901 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-root\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.443961 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443928 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-tls\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.444147 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443961 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.444147 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.443988 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnmp\" (UniqueName: \"kubernetes.io/projected/c2485ee1-2466-4a88-b136-103a66abef2f-kube-api-access-9cnmp\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.444147 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.444019 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-sys\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.444147 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.444103 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-sys\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.444147 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.444133 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2485ee1-2466-4a88-b136-103a66abef2f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.444429 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.444199 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/efda0d48-2fab-4250-bd44-8d6f6bc536e2-root\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.444429 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.444234 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efda0d48-2fab-4250-bd44-8d6f6bc536e2-metrics-client-ca\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.444537 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.444433 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c2485ee1-2466-4a88-b136-103a66abef2f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.445320 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:57.444601 2583 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 24 19:07:57.445320 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:07:57.444673 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-tls podName:c2485ee1-2466-4a88-b136-103a66abef2f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:57.944654931 +0000 UTC m=+76.549946331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-n2p7v" (UID: "c2485ee1-2466-4a88-b136-103a66abef2f") : secret "kube-state-metrics-tls" not found
Apr 24 19:07:57.445320 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.444728 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.445320 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.445239 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.447043 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.447010 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-tls\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.447486 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.447467 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.457424 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.457396 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efda0d48-2fab-4250-bd44-8d6f6bc536e2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.465973 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.460233 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnmp\" (UniqueName: \"kubernetes.io/projected/c2485ee1-2466-4a88-b136-103a66abef2f-kube-api-access-9cnmp\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.465973 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.463703 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfs2\" (UniqueName: \"kubernetes.io/projected/efda0d48-2fab-4250-bd44-8d6f6bc536e2-kube-api-access-ljfs2\") pod \"node-exporter-cmsk6\" (UID: \"efda0d48-2fab-4250-bd44-8d6f6bc536e2\") " pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.619403 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.619306 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cmsk6"
Apr 24 19:07:57.629632 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:57.629603 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefda0d48_2fab_4250_bd44_8d6f6bc536e2.slice/crio-6f6e7cf245f4423331d7de78f3b01e78140caf3680394d0e499ef46145a217b4 WatchSource:0}: Error finding container 6f6e7cf245f4423331d7de78f3b01e78140caf3680394d0e499ef46145a217b4: Status 404 returned error can't find the container with id 6f6e7cf245f4423331d7de78f3b01e78140caf3680394d0e499ef46145a217b4
Apr 24 19:07:57.947310 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.947194 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:57.949731 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:57.949703 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c2485ee1-2466-4a88-b136-103a66abef2f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-n2p7v\" (UID: \"c2485ee1-2466-4a88-b136-103a66abef2f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:58.194636 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:58.194604 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"
Apr 24 19:07:58.346666 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:58.346614 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmsk6" event={"ID":"efda0d48-2fab-4250-bd44-8d6f6bc536e2","Type":"ContainerStarted","Data":"6f6e7cf245f4423331d7de78f3b01e78140caf3680394d0e499ef46145a217b4"}
Apr 24 19:07:58.467216 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:58.467193 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-n2p7v"]
Apr 24 19:07:58.469746 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:07:58.469719 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2485ee1_2466_4a88_b136_103a66abef2f.slice/crio-65913566ec2bbd136db1921c981c92434a56240006555afd45eb162ea730cc90 WatchSource:0}: Error finding container 65913566ec2bbd136db1921c981c92434a56240006555afd45eb162ea730cc90: Status 404 returned error can't find the container with id 65913566ec2bbd136db1921c981c92434a56240006555afd45eb162ea730cc90
Apr 24 19:07:59.351482 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:59.351384 2583 generic.go:358] "Generic (PLEG): container finished" podID="efda0d48-2fab-4250-bd44-8d6f6bc536e2" containerID="5c4feb0fcc1f36bf3e953d254ecf138157d3f81c7903c31e2f9707d49e154300" exitCode=0
Apr 24 19:07:59.351942 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:59.351484 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmsk6" event={"ID":"efda0d48-2fab-4250-bd44-8d6f6bc536e2","Type":"ContainerDied","Data":"5c4feb0fcc1f36bf3e953d254ecf138157d3f81c7903c31e2f9707d49e154300"}
Apr 24 19:07:59.352811 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:07:59.352777 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v" event={"ID":"c2485ee1-2466-4a88-b136-103a66abef2f","Type":"ContainerStarted","Data":"65913566ec2bbd136db1921c981c92434a56240006555afd45eb162ea730cc90"}
Apr 24 19:08:00.358231 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.358194 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmsk6" event={"ID":"efda0d48-2fab-4250-bd44-8d6f6bc536e2","Type":"ContainerStarted","Data":"b6d7619338f8e4a2e20e2e0902710730ae44466ddc106cb76de060f7f85ec913"}
Apr 24 19:08:00.358747 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.358239 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmsk6" event={"ID":"efda0d48-2fab-4250-bd44-8d6f6bc536e2","Type":"ContainerStarted","Data":"75498334c8d1eefd9db90655161e8f25f202bf97d5aa9d0c84090f672093b062"}
Apr 24 19:08:00.360017 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.359991 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v" event={"ID":"c2485ee1-2466-4a88-b136-103a66abef2f","Type":"ContainerStarted","Data":"c3feaa0bcf271172d6e77f32a57de8eaca2ee17d16c8a9c98c4daa1ed7f5319b"}
Apr 24 19:08:00.360132 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.360022 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v" event={"ID":"c2485ee1-2466-4a88-b136-103a66abef2f","Type":"ContainerStarted","Data":"188df270d8e8b1516ba3c92ab09414660ea3f3073bf21e35b130cb765866e658"}
Apr 24 19:08:00.391588 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.391532 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cmsk6" podStartSLOduration=2.657371047 podStartE2EDuration="3.391514255s" podCreationTimestamp="2026-04-24 19:07:57 +0000 UTC" firstStartedPulling="2026-04-24 19:07:57.632058746 +0000 UTC m=+76.237350140" lastFinishedPulling="2026-04-24 19:07:58.366201953 +0000 UTC m=+76.971493348" observedRunningTime="2026-04-24 19:08:00.389768796 +0000 UTC m=+78.995060212" watchObservedRunningTime="2026-04-24 19:08:00.391514255 +0000 UTC m=+78.996805664"
Apr 24 19:08:00.676670 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.676589 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-ccb4f9c5d-h5qnb"]
Apr 24 19:08:00.702849 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.702817 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ccb4f9c5d-h5qnb"]
Apr 24 19:08:00.702992 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.702952 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.705539 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.705516 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-d7g57\""
Apr 24 19:08:00.705669 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.705615 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 19:08:00.705788 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.705772 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 19:08:00.705860 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.705795 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 19:08:00.705918 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.705867 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 19:08:00.705918 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.705885 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 19:08:00.706173 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.706154 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 19:08:00.706542 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.706530 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 19:08:00.710738 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.710722 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 19:08:00.770745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.770708 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-trusted-ca-bundle\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.770745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.770745 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5m95\" (UniqueName: \"kubernetes.io/projected/4c668914-260e-4c3f-9f14-f62056bc7b78-kube-api-access-r5m95\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.770979 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.770769 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-serving-cert\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.770979 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.770828 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-console-config\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.770979 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.770907 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-oauth-config\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.770979 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.770945 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-oauth-serving-cert\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.770979 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.770963 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-service-ca\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.871345 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.871303 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-serving-cert\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.871534 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.871359 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-console-config\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.871534 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.871424 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-oauth-config\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.871534 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.871446 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-oauth-serving-cert\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.871534 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.871470 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-service-ca\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.871534 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.871511 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-trusted-ca-bundle\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.871776 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.871534 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5m95\" (UniqueName: \"kubernetes.io/projected/4c668914-260e-4c3f-9f14-f62056bc7b78-kube-api-access-r5m95\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.872152 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.872126 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-console-config\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.872207 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.872178 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-service-ca\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.872207 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.872184 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-oauth-serving-cert\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:00.872352 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.872336 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-trusted-ca-bundle\") pod 
\"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb" Apr 24 19:08:00.881576 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.881540 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5m95\" (UniqueName: \"kubernetes.io/projected/4c668914-260e-4c3f-9f14-f62056bc7b78-kube-api-access-r5m95\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb" Apr 24 19:08:00.883458 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.883440 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-oauth-config\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb" Apr 24 19:08:00.883710 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:00.883686 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-serving-cert\") pod \"console-ccb4f9c5d-h5qnb\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") " pod="openshift-console/console-ccb4f9c5d-h5qnb" Apr 24 19:08:01.013040 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:01.012950 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-ccb4f9c5d-h5qnb" Apr 24 19:08:01.134202 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:01.134167 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ccb4f9c5d-h5qnb"] Apr 24 19:08:01.136933 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:08:01.136908 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c668914_260e_4c3f_9f14_f62056bc7b78.slice/crio-468467c277d8b344a3f3c3bc9722ebe924f5681689ed155cab314cfab946dc97 WatchSource:0}: Error finding container 468467c277d8b344a3f3c3bc9722ebe924f5681689ed155cab314cfab946dc97: Status 404 returned error can't find the container with id 468467c277d8b344a3f3c3bc9722ebe924f5681689ed155cab314cfab946dc97 Apr 24 19:08:01.365167 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:01.365126 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v" event={"ID":"c2485ee1-2466-4a88-b136-103a66abef2f","Type":"ContainerStarted","Data":"f4a1ff4511dd1bd24479e8a81564083edc72b1a6e33e1892c31d36e5b1786d39"} Apr 24 19:08:01.366276 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:01.366230 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ccb4f9c5d-h5qnb" event={"ID":"4c668914-260e-4c3f-9f14-f62056bc7b78","Type":"ContainerStarted","Data":"468467c277d8b344a3f3c3bc9722ebe924f5681689ed155cab314cfab946dc97"} Apr 24 19:08:01.389275 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:01.389202 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-n2p7v" podStartSLOduration=2.853358833 podStartE2EDuration="4.38918601s" podCreationTimestamp="2026-04-24 19:07:57 +0000 UTC" firstStartedPulling="2026-04-24 19:07:58.471907128 +0000 UTC m=+77.077198536" lastFinishedPulling="2026-04-24 19:08:00.007734319 +0000 UTC m=+78.613025713" 
observedRunningTime="2026-04-24 19:08:01.387451307 +0000 UTC m=+79.992742724" watchObservedRunningTime="2026-04-24 19:08:01.38918601 +0000 UTC m=+79.994477426" Apr 24 19:08:02.333684 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:02.333653 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zl4tq" Apr 24 19:08:03.466165 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.466126 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:08:03.470401 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.470368 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.472836 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.472810 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 19:08:03.473796 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.473272 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 19:08:03.473796 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.473427 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 19:08:03.473796 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.473517 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 19:08:03.473796 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.473709 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 19:08:03.474008 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.473962 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 19:08:03.474008 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.473990 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-kxpfh\"" Apr 24 19:08:03.474070 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.474031 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 19:08:03.474102 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.474088 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 19:08:03.474162 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.474042 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 19:08:03.474312 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.474295 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 19:08:03.474415 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.474326 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-91d61ujimh4dk\"" Apr 24 19:08:03.474415 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.474367 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 19:08:03.474623 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.474490 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 19:08:03.477721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.477701 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 19:08:03.488993 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.488963 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:08:03.492921 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.492894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493067 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.492931 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493067 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.492961 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493067 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493014 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493067 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:08:03.493047 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493075 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493140 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493201 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493282 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5n6\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-kube-api-access-bt5n6\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493314 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493306 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493617 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493339 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493617 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493406 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-config-out\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493617 
ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493445 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493617 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493463 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-web-config\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493617 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493487 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493617 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493518 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-config\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.493617 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.493555 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-metrics-client-ca\") pod 
\"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.594983 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.594935 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595169 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595001 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595269 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595038 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595269 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595237 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595396 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595298 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5n6\" (UniqueName: 
\"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-kube-api-access-bt5n6\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595396 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595329 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595396 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595553 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595412 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-config-out\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595553 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595443 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595553 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595469 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-web-config\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595553 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595505 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595553 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595548 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-config\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595805 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595570 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595805 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595610 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595805 ip-10-0-137-23 kubenswrapper[2583]: I0424 
19:08:03.595627 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595805 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595643 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595805 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595671 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.595805 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.595697 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.599199 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.597165 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.599199 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:08:03.597650 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle podName:c6563320-5bc9-4398-bfaa-802f73c524a3 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:04.097625991 +0000 UTC m=+82.702917400 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3") : configmap references non-existent config key: ca-bundle.crt Apr 24 19:08:03.599199 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.598283 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.599199 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.598450 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:08:03.599199 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.598814 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " 
pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.599199 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.598944 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.599883 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.600283 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.600651 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.600841 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.601061 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601354 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.601305 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-config\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601716 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.601690 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.601952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.601908 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-config-out\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.602127 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.602102 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.602306 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.602287 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-web-config\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.603397 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.603377 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:03.611523 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:03.611499 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5n6\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-kube-api-access-bt5n6\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:04.100756 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:04.100727 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:04.101659 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:04.101633 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:04.382294 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:04.382182 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ccb4f9c5d-h5qnb" event={"ID":"4c668914-260e-4c3f-9f14-f62056bc7b78","Type":"ContainerStarted","Data":"ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5"}
Apr 24 19:08:04.383594 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:04.383577 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:04.401469 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:04.401411 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ccb4f9c5d-h5qnb" podStartSLOduration=1.441898999 podStartE2EDuration="4.401396227s" podCreationTimestamp="2026-04-24 19:08:00 +0000 UTC" firstStartedPulling="2026-04-24 19:08:01.139292546 +0000 UTC m=+79.744583939" lastFinishedPulling="2026-04-24 19:08:04.098789755 +0000 UTC m=+82.704081167" observedRunningTime="2026-04-24 19:08:04.399970614 +0000 UTC m=+83.005262031" watchObservedRunningTime="2026-04-24 19:08:04.401396227 +0000 UTC m=+83.006687642"
Apr 24 19:08:04.521227 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:04.521166 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 19:08:04.523870 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:08:04.523842 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6563320_5bc9_4398_bfaa_802f73c524a3.slice/crio-4ca0b23df5f9292632a38f11e4190e5fc614325426101c3b2cf8bc449405fe15 WatchSource:0}: Error finding container 4ca0b23df5f9292632a38f11e4190e5fc614325426101c3b2cf8bc449405fe15: Status 404 returned error can't find the container with id 4ca0b23df5f9292632a38f11e4190e5fc614325426101c3b2cf8bc449405fe15
Apr 24 19:08:05.386826 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:05.386794 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerStarted","Data":"4ca0b23df5f9292632a38f11e4190e5fc614325426101c3b2cf8bc449405fe15"}
Apr 24 19:08:06.391438 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:06.391400 2583 generic.go:358] "Generic (PLEG): container finished" podID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" exitCode=0
Apr 24 19:08:06.391898 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:06.391448 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54"}
Apr 24 19:08:07.412406 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:07.412374 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:08:08.097082 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:08.095565 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ccb4f9c5d-h5qnb"]
Apr 24 19:08:09.311732 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:09.311700 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-887cb6fd5-fs5mk"
Apr 24 19:08:09.403776 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:09.403739 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerStarted","Data":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"}
Apr 24 19:08:09.403776 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:09.403776 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerStarted","Data":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"}
Apr 24 19:08:11.013964 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:11.013924 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:11.412943 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:11.412912 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerStarted","Data":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"}
Apr 24 19:08:11.412943 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:11.412945 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerStarted","Data":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"}
Apr 24 19:08:11.413142 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:11.412955 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerStarted","Data":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"}
Apr 24 19:08:11.413142 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:11.412964 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerStarted","Data":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"}
Apr 24 19:08:11.448959 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:11.448908 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.250580902 podStartE2EDuration="8.448890482s" podCreationTimestamp="2026-04-24 19:08:03 +0000 UTC" firstStartedPulling="2026-04-24 19:08:04.525795286 +0000 UTC m=+83.131086683" lastFinishedPulling="2026-04-24 19:08:10.724104866 +0000 UTC m=+89.329396263" observedRunningTime="2026-04-24 19:08:11.448627572 +0000 UTC m=+90.053919026" watchObservedRunningTime="2026-04-24 19:08:11.448890482 +0000 UTC m=+90.054181899"
Apr 24 19:08:12.426766 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.426720 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-85f79f764-d9m27" podUID="ed18fab9-300d-43bb-9afb-6e4c9b597e56" containerName="registry" containerID="cri-o://4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80" gracePeriod=30
Apr 24 19:08:12.660480 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.660456 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:08:12.784085 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784050 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-installation-pull-secrets\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784102 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-trusted-ca\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784136 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nntmn\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-kube-api-access-nntmn\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784159 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ed18fab9-300d-43bb-9afb-6e4c9b597e56-ca-trust-extracted\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784195 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-certificates\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784221 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784278 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-bound-sa-token\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784599 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784345 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-image-registry-private-configuration\") pod \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\" (UID: \"ed18fab9-300d-43bb-9afb-6e4c9b597e56\") "
Apr 24 19:08:12.784657 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784624 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:12.785024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.784994 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:12.786862 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.786800 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-kube-api-access-nntmn" (OuterVolumeSpecName: "kube-api-access-nntmn") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "kube-api-access-nntmn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:12.786962 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.786899 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:12.787093 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.787070 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:12.787195 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.787119 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:12.787195 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.787132 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:12.793420 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.793393 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed18fab9-300d-43bb-9afb-6e4c9b597e56-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ed18fab9-300d-43bb-9afb-6e4c9b597e56" (UID: "ed18fab9-300d-43bb-9afb-6e4c9b597e56"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:08:12.885764 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885721 2583 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-image-registry-private-configuration\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:12.885764 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885756 2583 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ed18fab9-300d-43bb-9afb-6e4c9b597e56-installation-pull-secrets\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:12.885764 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885770 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-trusted-ca\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:12.886000 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885783 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nntmn\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-kube-api-access-nntmn\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:12.886000 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885797 2583 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ed18fab9-300d-43bb-9afb-6e4c9b597e56-ca-trust-extracted\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:12.886000 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885809 2583 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-certificates\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:12.886000 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885821 2583 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-registry-tls\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:12.886000 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:12.885832 2583 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed18fab9-300d-43bb-9afb-6e4c9b597e56-bound-sa-token\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:13.422301 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.422245 2583 generic.go:358] "Generic (PLEG): container finished" podID="ed18fab9-300d-43bb-9afb-6e4c9b597e56" containerID="4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80" exitCode=0
Apr 24 19:08:13.422478 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.422338 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85f79f764-d9m27"
Apr 24 19:08:13.422478 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.422347 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85f79f764-d9m27" event={"ID":"ed18fab9-300d-43bb-9afb-6e4c9b597e56","Type":"ContainerDied","Data":"4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80"}
Apr 24 19:08:13.422478 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.422390 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85f79f764-d9m27" event={"ID":"ed18fab9-300d-43bb-9afb-6e4c9b597e56","Type":"ContainerDied","Data":"6c4ed1f6b307524aa279a79b77a0b01995d337a69dc17dc3d6911eeae783ec3c"}
Apr 24 19:08:13.422478 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.422411 2583 scope.go:117] "RemoveContainer" containerID="4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80"
Apr 24 19:08:13.430977 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.430919 2583 scope.go:117] "RemoveContainer" containerID="4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80"
Apr 24 19:08:13.431244 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:08:13.431204 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80\": container with ID starting with 4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80 not found: ID does not exist" containerID="4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80"
Apr 24 19:08:13.431244 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.431232 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80"} err="failed to get container status \"4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80\": rpc error: code = NotFound desc = could not find container \"4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80\": container with ID starting with 4b34c594213e453bb97676790d1a3b353e5602be6773b8a0edf4d31bc7723f80 not found: ID does not exist"
Apr 24 19:08:13.444020 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.443992 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85f79f764-d9m27"]
Apr 24 19:08:13.450298 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:13.450274 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85f79f764-d9m27"]
Apr 24 19:08:14.018087 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:14.018046 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed18fab9-300d-43bb-9afb-6e4c9b597e56" path="/var/lib/kubelet/pods/ed18fab9-300d-43bb-9afb-6e4c9b597e56/volumes"
Apr 24 19:08:14.383811 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:14.383712 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:08:33.117870 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.117809 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-ccb4f9c5d-h5qnb" podUID="4c668914-260e-4c3f-9f14-f62056bc7b78" containerName="console" containerID="cri-o://ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5" gracePeriod=15
Apr 24 19:08:33.358315 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.358291 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ccb4f9c5d-h5qnb_4c668914-260e-4c3f-9f14-f62056bc7b78/console/0.log"
Apr 24 19:08:33.358452 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.358365 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:33.455376 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455285 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-serving-cert\") pod \"4c668914-260e-4c3f-9f14-f62056bc7b78\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") "
Apr 24 19:08:33.455376 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455347 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5m95\" (UniqueName: \"kubernetes.io/projected/4c668914-260e-4c3f-9f14-f62056bc7b78-kube-api-access-r5m95\") pod \"4c668914-260e-4c3f-9f14-f62056bc7b78\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") "
Apr 24 19:08:33.455376 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455377 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-trusted-ca-bundle\") pod \"4c668914-260e-4c3f-9f14-f62056bc7b78\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") "
Apr 24 19:08:33.455656 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455423 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-oauth-config\") pod \"4c668914-260e-4c3f-9f14-f62056bc7b78\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") "
Apr 24 19:08:33.455656 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455448 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-oauth-serving-cert\") pod \"4c668914-260e-4c3f-9f14-f62056bc7b78\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") "
Apr 24 19:08:33.455656 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455470 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-service-ca\") pod \"4c668914-260e-4c3f-9f14-f62056bc7b78\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") "
Apr 24 19:08:33.455656 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455503 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-console-config\") pod \"4c668914-260e-4c3f-9f14-f62056bc7b78\" (UID: \"4c668914-260e-4c3f-9f14-f62056bc7b78\") "
Apr 24 19:08:33.455994 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455943 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4c668914-260e-4c3f-9f14-f62056bc7b78" (UID: "4c668914-260e-4c3f-9f14-f62056bc7b78"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:33.455994 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.455976 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4c668914-260e-4c3f-9f14-f62056bc7b78" (UID: "4c668914-260e-4c3f-9f14-f62056bc7b78"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:33.456327 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.456041 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-console-config" (OuterVolumeSpecName: "console-config") pod "4c668914-260e-4c3f-9f14-f62056bc7b78" (UID: "4c668914-260e-4c3f-9f14-f62056bc7b78"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:33.456327 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.456310 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-service-ca" (OuterVolumeSpecName: "service-ca") pod "4c668914-260e-4c3f-9f14-f62056bc7b78" (UID: "4c668914-260e-4c3f-9f14-f62056bc7b78"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:08:33.457829 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.457793 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4c668914-260e-4c3f-9f14-f62056bc7b78" (UID: "4c668914-260e-4c3f-9f14-f62056bc7b78"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:33.457933 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.457868 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4c668914-260e-4c3f-9f14-f62056bc7b78" (UID: "4c668914-260e-4c3f-9f14-f62056bc7b78"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:08:33.457933 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.457875 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c668914-260e-4c3f-9f14-f62056bc7b78-kube-api-access-r5m95" (OuterVolumeSpecName: "kube-api-access-r5m95") pod "4c668914-260e-4c3f-9f14-f62056bc7b78" (UID: "4c668914-260e-4c3f-9f14-f62056bc7b78"). InnerVolumeSpecName "kube-api-access-r5m95". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:08:33.480505 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.480482 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ccb4f9c5d-h5qnb_4c668914-260e-4c3f-9f14-f62056bc7b78/console/0.log"
Apr 24 19:08:33.480636 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.480520 2583 generic.go:358] "Generic (PLEG): container finished" podID="4c668914-260e-4c3f-9f14-f62056bc7b78" containerID="ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5" exitCode=2
Apr 24 19:08:33.480636 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.480560 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ccb4f9c5d-h5qnb" event={"ID":"4c668914-260e-4c3f-9f14-f62056bc7b78","Type":"ContainerDied","Data":"ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5"}
Apr 24 19:08:33.480636 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.480583 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ccb4f9c5d-h5qnb" event={"ID":"4c668914-260e-4c3f-9f14-f62056bc7b78","Type":"ContainerDied","Data":"468467c277d8b344a3f3c3bc9722ebe924f5681689ed155cab314cfab946dc97"}
Apr 24 19:08:33.480636 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.480590 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ccb4f9c5d-h5qnb"
Apr 24 19:08:33.480763 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.480597 2583 scope.go:117] "RemoveContainer" containerID="ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5"
Apr 24 19:08:33.489397 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.489381 2583 scope.go:117] "RemoveContainer" containerID="ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5"
Apr 24 19:08:33.489633 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:08:33.489615 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5\": container with ID starting with ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5 not found: ID does not exist" containerID="ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5"
Apr 24 19:08:33.489677 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.489641 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5"} err="failed to get container status \"ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5\": rpc error: code = NotFound desc = could not find container \"ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5\": container with ID starting with ea2feb5211c1f4048292036d30a40f5eecfd3e02199238a6c8bbe7a6d07b94c5 not found: ID does not exist"
Apr 24 19:08:33.501387 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.501362 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ccb4f9c5d-h5qnb"]
Apr 24 19:08:33.510699 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.507520 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ccb4f9c5d-h5qnb"]
Apr 24 19:08:33.556115 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.556080 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-serving-cert\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:33.556115 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.556115 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5m95\" (UniqueName: \"kubernetes.io/projected/4c668914-260e-4c3f-9f14-f62056bc7b78-kube-api-access-r5m95\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:33.556309 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.556125 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-trusted-ca-bundle\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:33.556309 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.556134 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c668914-260e-4c3f-9f14-f62056bc7b78-console-oauth-config\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:33.556309 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.556143 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-oauth-serving-cert\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:33.556309 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.556153 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-service-ca\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:33.556309 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:33.556162 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c668914-260e-4c3f-9f14-f62056bc7b78-console-config\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\""
Apr 24 19:08:34.017677 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:08:34.017640 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c668914-260e-4c3f-9f14-f62056bc7b78" path="/var/lib/kubelet/pods/4c668914-260e-4c3f-9f14-f62056bc7b78/volumes"
Apr 24 19:09:04.384012 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:04.383966 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:09:04.404094 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:04.404066 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:09:04.588376 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:04.588347 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 19:09:21.895683 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:21.895646 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 19:09:21.896154 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:21.896061 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="prometheus" containerID="cri-o://68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" gracePeriod=600
Apr 24 19:09:21.896154 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:21.896118 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-thanos" containerID="cri-o://e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" gracePeriod=600
Apr 24 19:09:21.896323
ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:21.896138 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-web" containerID="cri-o://10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" gracePeriod=600 Apr 24 19:09:21.896323 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:21.896190 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="config-reloader" containerID="cri-o://1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" gracePeriod=600 Apr 24 19:09:21.896323 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:21.896166 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy" containerID="cri-o://a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" gracePeriod=600 Apr 24 19:09:21.896472 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:21.896166 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="thanos-sidecar" containerID="cri-o://61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" gracePeriod=600 Apr 24 19:09:22.138782 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.138759 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.228982 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.228901 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.228982 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.228954 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-serving-certs-ca-bundle\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.228985 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-metrics-client-certs\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229013 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-thanos-prometheus-http-client-file\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229055 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-config-out\") pod 
\"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229079 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-metrics-client-ca\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229114 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229138 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229161 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-config\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229195 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-tls\") pod 
\"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229233 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-web-config\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229287 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-db\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229328 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-tls-assets\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229378 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-kube-rbac-proxy\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229409 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-grpc-tls\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:09:22.229437 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt5n6\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-kube-api-access-bt5n6\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229445 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229464 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-kubelet-serving-ca-bundle\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.229644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229541 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-rulefiles-0\") pod \"c6563320-5bc9-4398-bfaa-802f73c524a3\" (UID: \"c6563320-5bc9-4398-bfaa-802f73c524a3\") " Apr 24 19:09:22.230243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229857 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-serving-certs-ca-bundle\") on node 
\"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.230243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.229939 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:09:22.230710 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.230410 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:09:22.230860 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.230835 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:09:22.230953 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.230933 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:09:22.231244 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.231214 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:09:22.233875 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.233852 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.234146 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234056 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.234146 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234101 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.234322 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234192 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:09:22.234434 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234375 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.234583 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234550 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-kube-api-access-bt5n6" (OuterVolumeSpecName: "kube-api-access-bt5n6") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "kube-api-access-bt5n6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:09:22.234640 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234575 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.234640 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234590 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-config" (OuterVolumeSpecName: "config") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.234733 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.234701 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-config-out" (OuterVolumeSpecName: "config-out") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:09:22.235762 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.235734 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.236207 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.236169 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.244860 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.244839 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-web-config" (OuterVolumeSpecName: "web-config") pod "c6563320-5bc9-4398-bfaa-802f73c524a3" (UID: "c6563320-5bc9-4398-bfaa-802f73c524a3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:09:22.330205 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330157 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-db\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330205 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330197 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-tls-assets\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330205 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330211 2583 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-kube-rbac-proxy\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330224 2583 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-grpc-tls\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330237 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bt5n6\" (UniqueName: 
\"kubernetes.io/projected/c6563320-5bc9-4398-bfaa-802f73c524a3-kube-api-access-bt5n6\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330279 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330293 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330305 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330320 2583 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-metrics-client-certs\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330331 2583 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330344 2583 reconciler_common.go:299] "Volume detached 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c6563320-5bc9-4398-bfaa-802f73c524a3-config-out\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330355 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-configmap-metrics-client-ca\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330367 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6563320-5bc9-4398-bfaa-802f73c524a3-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330380 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330395 2583 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-config\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330408 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.330507 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.330420 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/c6563320-5bc9-4398-bfaa-802f73c524a3-web-config\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:09:22.628410 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628374 2583 generic.go:358] "Generic (PLEG): container finished" podID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" exitCode=0 Apr 24 19:09:22.628410 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628401 2583 generic.go:358] "Generic (PLEG): container finished" podID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" exitCode=0 Apr 24 19:09:22.628410 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628408 2583 generic.go:358] "Generic (PLEG): container finished" podID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" exitCode=0 Apr 24 19:09:22.628410 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628416 2583 generic.go:358] "Generic (PLEG): container finished" podID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" exitCode=0 Apr 24 19:09:22.628410 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628421 2583 generic.go:358] "Generic (PLEG): container finished" podID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" exitCode=0 Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628426 2583 generic.go:358] "Generic (PLEG): container finished" podID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" exitCode=0 Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628457 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"} Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628497 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628507 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"} Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628523 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"} Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628535 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"} Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628548 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"} Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628561 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"} Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628570 2583 scope.go:117] "RemoveContainer" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.628745 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.628573 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c6563320-5bc9-4398-bfaa-802f73c524a3","Type":"ContainerDied","Data":"4ca0b23df5f9292632a38f11e4190e5fc614325426101c3b2cf8bc449405fe15"} Apr 24 19:09:22.636210 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.636190 2583 scope.go:117] "RemoveContainer" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.642870 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.642851 2583 scope.go:117] "RemoveContainer" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.649186 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.649143 2583 scope.go:117] "RemoveContainer" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.651970 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.651948 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:09:22.656491 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.656472 2583 scope.go:117] "RemoveContainer" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.656554 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.656514 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:09:22.663185 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.663165 2583 scope.go:117] "RemoveContainer" 
containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.670129 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.670108 2583 scope.go:117] "RemoveContainer" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.676337 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.676321 2583 scope.go:117] "RemoveContainer" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.676590 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:22.676566 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": container with ID starting with e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36 not found: ID does not exist" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.676629 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.676593 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"} err="failed to get container status \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": rpc error: code = NotFound desc = could not find container \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": container with ID starting with e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36 not found: ID does not exist" Apr 24 19:09:22.676629 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.676613 2583 scope.go:117] "RemoveContainer" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.676837 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:22.676821 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": container with ID starting with a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8 not found: ID does not exist" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.676876 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.676845 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"} err="failed to get container status \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": rpc error: code = NotFound desc = could not find container \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": container with ID starting with a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8 not found: ID does not exist" Apr 24 19:09:22.676876 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.676862 2583 scope.go:117] "RemoveContainer" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.677099 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:22.677086 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": container with ID starting with 10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27 not found: ID does not exist" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.677147 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.677104 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"} err="failed to get container status \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": rpc error: code = NotFound desc = could not find container 
\"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": container with ID starting with 10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27 not found: ID does not exist" Apr 24 19:09:22.677147 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.677117 2583 scope.go:117] "RemoveContainer" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.677411 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:22.677389 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": container with ID starting with 61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178 not found: ID does not exist" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.677461 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.677428 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"} err="failed to get container status \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": rpc error: code = NotFound desc = could not find container \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": container with ID starting with 61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178 not found: ID does not exist" Apr 24 19:09:22.677461 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.677444 2583 scope.go:117] "RemoveContainer" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.677680 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:22.677665 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": container with ID starting with 
1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260 not found: ID does not exist" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.677715 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.677684 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"} err="failed to get container status \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": rpc error: code = NotFound desc = could not find container \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": container with ID starting with 1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260 not found: ID does not exist" Apr 24 19:09:22.677715 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.677696 2583 scope.go:117] "RemoveContainer" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.678055 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:22.678032 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": container with ID starting with 68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da not found: ID does not exist" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.678139 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678059 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"} err="failed to get container status \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": rpc error: code = NotFound desc = could not find container \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": container with ID starting with 
68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da not found: ID does not exist" Apr 24 19:09:22.678139 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678079 2583 scope.go:117] "RemoveContainer" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.678381 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:22.678359 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": container with ID starting with c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54 not found: ID does not exist" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.678472 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678388 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54"} err="failed to get container status \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": rpc error: code = NotFound desc = could not find container \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": container with ID starting with c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54 not found: ID does not exist" Apr 24 19:09:22.678472 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678404 2583 scope.go:117] "RemoveContainer" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.678761 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678740 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"} err="failed to get container status \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": rpc error: code = NotFound desc = could not find container 
\"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": container with ID starting with e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36 not found: ID does not exist" Apr 24 19:09:22.678761 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678762 2583 scope.go:117] "RemoveContainer" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.678942 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678925 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"} err="failed to get container status \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": rpc error: code = NotFound desc = could not find container \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": container with ID starting with a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8 not found: ID does not exist" Apr 24 19:09:22.678994 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.678945 2583 scope.go:117] "RemoveContainer" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.679178 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679162 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"} err="failed to get container status \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": rpc error: code = NotFound desc = could not find container \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": container with ID starting with 10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27 not found: ID does not exist" Apr 24 19:09:22.679216 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679179 2583 scope.go:117] "RemoveContainer" 
containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.679415 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679400 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"} err="failed to get container status \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": rpc error: code = NotFound desc = could not find container \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": container with ID starting with 61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178 not found: ID does not exist" Apr 24 19:09:22.679465 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679415 2583 scope.go:117] "RemoveContainer" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.679640 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679622 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"} err="failed to get container status \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": rpc error: code = NotFound desc = could not find container \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": container with ID starting with 1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260 not found: ID does not exist" Apr 24 19:09:22.679677 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679642 2583 scope.go:117] "RemoveContainer" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.679862 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679846 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"} err="failed to get container status 
\"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": rpc error: code = NotFound desc = could not find container \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": container with ID starting with 68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da not found: ID does not exist" Apr 24 19:09:22.679906 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.679863 2583 scope.go:117] "RemoveContainer" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.680031 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680016 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54"} err="failed to get container status \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": rpc error: code = NotFound desc = could not find container \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": container with ID starting with c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54 not found: ID does not exist" Apr 24 19:09:22.680067 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680032 2583 scope.go:117] "RemoveContainer" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.680220 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680204 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"} err="failed to get container status \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": rpc error: code = NotFound desc = could not find container \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": container with ID starting with e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36 not found: ID does not exist" Apr 24 19:09:22.680319 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:09:22.680221 2583 scope.go:117] "RemoveContainer" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.680472 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680455 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"} err="failed to get container status \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": rpc error: code = NotFound desc = could not find container \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": container with ID starting with a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8 not found: ID does not exist" Apr 24 19:09:22.680518 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680473 2583 scope.go:117] "RemoveContainer" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.680687 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680670 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"} err="failed to get container status \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": rpc error: code = NotFound desc = could not find container \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": container with ID starting with 10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27 not found: ID does not exist" Apr 24 19:09:22.680725 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680689 2583 scope.go:117] "RemoveContainer" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.680927 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680905 2583 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"} err="failed to get container status \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": rpc error: code = NotFound desc = could not find container \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": container with ID starting with 61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178 not found: ID does not exist" Apr 24 19:09:22.680927 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.680926 2583 scope.go:117] "RemoveContainer" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.681158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681141 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"} err="failed to get container status \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": rpc error: code = NotFound desc = could not find container \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": container with ID starting with 1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260 not found: ID does not exist" Apr 24 19:09:22.681158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681157 2583 scope.go:117] "RemoveContainer" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.681385 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681370 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"} err="failed to get container status \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": rpc error: code = NotFound desc = could not find container \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": container with ID starting with 
68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da not found: ID does not exist" Apr 24 19:09:22.681434 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681385 2583 scope.go:117] "RemoveContainer" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.681547 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681532 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54"} err="failed to get container status \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": rpc error: code = NotFound desc = could not find container \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": container with ID starting with c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54 not found: ID does not exist" Apr 24 19:09:22.681598 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681546 2583 scope.go:117] "RemoveContainer" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.681698 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681684 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"} err="failed to get container status \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": rpc error: code = NotFound desc = could not find container \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": container with ID starting with e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36 not found: ID does not exist" Apr 24 19:09:22.681742 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681698 2583 scope.go:117] "RemoveContainer" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.681889 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681870 2583 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"} err="failed to get container status \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": rpc error: code = NotFound desc = could not find container \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": container with ID starting with a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8 not found: ID does not exist" Apr 24 19:09:22.681959 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.681891 2583 scope.go:117] "RemoveContainer" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.682088 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682071 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"} err="failed to get container status \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": rpc error: code = NotFound desc = could not find container \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": container with ID starting with 10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27 not found: ID does not exist" Apr 24 19:09:22.682131 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682089 2583 scope.go:117] "RemoveContainer" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.682278 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682239 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"} err="failed to get container status \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": rpc error: code = NotFound desc = could not find container 
\"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": container with ID starting with 61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178 not found: ID does not exist" Apr 24 19:09:22.682278 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682276 2583 scope.go:117] "RemoveContainer" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.682487 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682470 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"} err="failed to get container status \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": rpc error: code = NotFound desc = could not find container \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": container with ID starting with 1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260 not found: ID does not exist" Apr 24 19:09:22.682531 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682489 2583 scope.go:117] "RemoveContainer" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.682709 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682693 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"} err="failed to get container status \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": rpc error: code = NotFound desc = could not find container \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": container with ID starting with 68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da not found: ID does not exist" Apr 24 19:09:22.682761 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682710 2583 scope.go:117] "RemoveContainer" 
containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.682915 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682898 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54"} err="failed to get container status \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": rpc error: code = NotFound desc = could not find container \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": container with ID starting with c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54 not found: ID does not exist" Apr 24 19:09:22.682954 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.682916 2583 scope.go:117] "RemoveContainer" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.683131 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683112 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"} err="failed to get container status \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": rpc error: code = NotFound desc = could not find container \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": container with ID starting with e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36 not found: ID does not exist" Apr 24 19:09:22.683180 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683131 2583 scope.go:117] "RemoveContainer" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.683330 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683313 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"} err="failed to get container status 
\"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": rpc error: code = NotFound desc = could not find container \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": container with ID starting with a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8 not found: ID does not exist" Apr 24 19:09:22.683397 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683331 2583 scope.go:117] "RemoveContainer" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.683536 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683520 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"} err="failed to get container status \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": rpc error: code = NotFound desc = could not find container \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": container with ID starting with 10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27 not found: ID does not exist" Apr 24 19:09:22.683536 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683534 2583 scope.go:117] "RemoveContainer" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.683742 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683726 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"} err="failed to get container status \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": rpc error: code = NotFound desc = could not find container \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": container with ID starting with 61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178 not found: ID does not exist" Apr 24 19:09:22.683783 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:09:22.683742 2583 scope.go:117] "RemoveContainer" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.683935 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683922 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"} err="failed to get container status \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": rpc error: code = NotFound desc = could not find container \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": container with ID starting with 1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260 not found: ID does not exist" Apr 24 19:09:22.683975 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.683935 2583 scope.go:117] "RemoveContainer" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.684130 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684113 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"} err="failed to get container status \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": rpc error: code = NotFound desc = could not find container \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": container with ID starting with 68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da not found: ID does not exist" Apr 24 19:09:22.684179 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684129 2583 scope.go:117] "RemoveContainer" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.684390 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684374 2583 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54"} err="failed to get container status \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": rpc error: code = NotFound desc = could not find container \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": container with ID starting with c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54 not found: ID does not exist" Apr 24 19:09:22.684446 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684390 2583 scope.go:117] "RemoveContainer" containerID="e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36" Apr 24 19:09:22.684592 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684575 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36"} err="failed to get container status \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": rpc error: code = NotFound desc = could not find container \"e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36\": container with ID starting with e9884e59fa15f2e3b204c276a1a0969835d6b4c00d79c7029015219dd694db36 not found: ID does not exist" Apr 24 19:09:22.684637 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684592 2583 scope.go:117] "RemoveContainer" containerID="a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8" Apr 24 19:09:22.684760 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684744 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8"} err="failed to get container status \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": rpc error: code = NotFound desc = could not find container \"a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8\": container with ID starting with 
a860ed41fe318dc4573e1212abeff410cc250258ba8fd533ea53ef4dbe8551e8 not found: ID does not exist" Apr 24 19:09:22.684797 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684760 2583 scope.go:117] "RemoveContainer" containerID="10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27" Apr 24 19:09:22.684991 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684972 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27"} err="failed to get container status \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": rpc error: code = NotFound desc = could not find container \"10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27\": container with ID starting with 10b4f0fc8265811ad499608d1231e37ab7706d6fe4bfd34bfd50eb9169b44d27 not found: ID does not exist" Apr 24 19:09:22.685033 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.684992 2583 scope.go:117] "RemoveContainer" containerID="61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178" Apr 24 19:09:22.685213 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.685197 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178"} err="failed to get container status \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": rpc error: code = NotFound desc = could not find container \"61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178\": container with ID starting with 61d50acdc802757cfe2ee05d176f8684c3ee7d41642382eb712998218e438178 not found: ID does not exist" Apr 24 19:09:22.685276 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.685214 2583 scope.go:117] "RemoveContainer" containerID="1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260" Apr 24 19:09:22.685435 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.685415 2583 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260"} err="failed to get container status \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": rpc error: code = NotFound desc = could not find container \"1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260\": container with ID starting with 1a6f12542266665ad659de94da7cae751a2e1007d0f7eddcb6007371384c4260 not found: ID does not exist" Apr 24 19:09:22.685473 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.685435 2583 scope.go:117] "RemoveContainer" containerID="68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da" Apr 24 19:09:22.685627 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.685611 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da"} err="failed to get container status \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": rpc error: code = NotFound desc = could not find container \"68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da\": container with ID starting with 68ce6270ecf293c17fdf9edbc2650295b6a3f4421cf7a09a1f606120512769da not found: ID does not exist" Apr 24 19:09:22.685664 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.685627 2583 scope.go:117] "RemoveContainer" containerID="c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54" Apr 24 19:09:22.685805 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.685788 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54"} err="failed to get container status \"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": rpc error: code = NotFound desc = could not find container 
\"c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54\": container with ID starting with c1b5da57cdc55a1c67c784d87e0bb7c9ab5c7a6b6093c15a29888634e8514e54 not found: ID does not exist" Apr 24 19:09:22.688377 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688358 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:09:22.688650 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688637 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="init-config-reloader" Apr 24 19:09:22.688685 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688655 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="init-config-reloader" Apr 24 19:09:22.688685 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688674 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="config-reloader" Apr 24 19:09:22.688685 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688681 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="config-reloader" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688689 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="prometheus" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688695 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="prometheus" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688701 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-web" Apr 24 19:09:22.688780 ip-10-0-137-23 
kubenswrapper[2583]: I0424 19:09:22.688706 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-web" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688712 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688717 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688723 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="thanos-sidecar" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688732 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="thanos-sidecar" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688744 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c668914-260e-4c3f-9f14-f62056bc7b78" containerName="console" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688749 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c668914-260e-4c3f-9f14-f62056bc7b78" containerName="console" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688756 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed18fab9-300d-43bb-9afb-6e4c9b597e56" containerName="registry" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688760 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed18fab9-300d-43bb-9afb-6e4c9b597e56" containerName="registry" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 
19:09:22.688765 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-thanos" Apr 24 19:09:22.688780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688770 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-thanos" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688809 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="prometheus" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688820 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688826 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="config-reloader" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688832 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-web" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688837 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed18fab9-300d-43bb-9afb-6e4c9b597e56" containerName="registry" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688842 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c668914-260e-4c3f-9f14-f62056bc7b78" containerName="console" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.688847 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="kube-rbac-proxy-thanos" Apr 24 19:09:22.689158 ip-10-0-137-23 kubenswrapper[2583]: I0424 
19:09:22.688853 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" containerName="thanos-sidecar" Apr 24 19:09:22.693911 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.693892 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.696358 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696308 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 19:09:22.696358 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696353 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 19:09:22.696676 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696547 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 19:09:22.696676 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696618 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 19:09:22.696676 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696639 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 19:09:22.696676 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696662 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 19:09:22.696887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696850 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 19:09:22.696887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696870 2583 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 19:09:22.697053 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696929 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 19:09:22.697053 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.696976 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-91d61ujimh4dk\"" Apr 24 19:09:22.697152 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.697120 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-kxpfh\"" Apr 24 19:09:22.697240 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.697156 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 19:09:22.697650 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.697633 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 19:09:22.701064 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.701043 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 19:09:22.703781 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.703764 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 19:09:22.712506 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.712485 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:09:22.733293 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733263 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733438 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733306 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-config\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733438 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733364 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733438 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733399 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733438 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733434 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
19:09:22.733568 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733464 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733568 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733489 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733568 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733506 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733568 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733525 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733680 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733595 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/ff63673d-1ba8-4037-8ce8-9e7e038a5468-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733680 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733612 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v69tk\" (UniqueName: \"kubernetes.io/projected/ff63673d-1ba8-4037-8ce8-9e7e038a5468-kube-api-access-v69tk\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733680 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733629 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ff63673d-1ba8-4037-8ce8-9e7e038a5468-config-out\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733680 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733655 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-web-config\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733794 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733694 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733794 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733720 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733794 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733738 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733794 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733756 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.733794 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.733777 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.834986 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835024 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835031 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835055 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835074 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835095 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835111 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835144 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-config\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835168 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835184 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835228 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835294 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835320 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835343 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835367 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835390 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ff63673d-1ba8-4037-8ce8-9e7e038a5468-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 
ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835414 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v69tk\" (UniqueName: \"kubernetes.io/projected/ff63673d-1ba8-4037-8ce8-9e7e038a5468-kube-api-access-v69tk\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.835437 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835442 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ff63673d-1ba8-4037-8ce8-9e7e038a5468-config-out\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.836165 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835467 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-web-config\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.836165 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835604 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.836165 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.835911 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.836165 
ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.836047 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.836383 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.836296 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.837866 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.837792 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.838886 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.838858 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-web-config\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.838977 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.838949 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.840066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.839799 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-config\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.840066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.839832 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ff63673d-1ba8-4037-8ce8-9e7e038a5468-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.840066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.839955 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.840066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.839999 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.840363 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.840104 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
19:09:22.840363 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.840187 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.840901 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.840874 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.841033 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.841018 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ff63673d-1ba8-4037-8ce8-9e7e038a5468-config-out\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.841141 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.841125 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.841369 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.841351 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ff63673d-1ba8-4037-8ce8-9e7e038a5468-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:22.844939 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:22.844920 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v69tk\" (UniqueName: \"kubernetes.io/projected/ff63673d-1ba8-4037-8ce8-9e7e038a5468-kube-api-access-v69tk\") pod \"prometheus-k8s-0\" (UID: \"ff63673d-1ba8-4037-8ce8-9e7e038a5468\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:23.005029 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:23.004989 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:09:23.144822 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:23.144791 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 19:09:23.147106 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:09:23.147075 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff63673d_1ba8_4037_8ce8_9e7e038a5468.slice/crio-6aece5ac1865a6f67b8f34d792725ae0ebeae4cf287679517e93694bdcb19c32 WatchSource:0}: Error finding container 6aece5ac1865a6f67b8f34d792725ae0ebeae4cf287679517e93694bdcb19c32: Status 404 returned error can't find the container with id 6aece5ac1865a6f67b8f34d792725ae0ebeae4cf287679517e93694bdcb19c32 Apr 24 19:09:23.634631 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:23.634599 2583 generic.go:358] "Generic (PLEG): container finished" podID="ff63673d-1ba8-4037-8ce8-9e7e038a5468" containerID="c2fc34a9d85833457e8530df42ae67a2d6d7e1579a772c07913a07286cf02728" exitCode=0 Apr 24 19:09:23.634785 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:23.634686 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerDied","Data":"c2fc34a9d85833457e8530df42ae67a2d6d7e1579a772c07913a07286cf02728"} Apr 24 19:09:23.634785 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:23.634719 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerStarted","Data":"6aece5ac1865a6f67b8f34d792725ae0ebeae4cf287679517e93694bdcb19c32"} Apr 24 19:09:24.020452 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.020424 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6563320-5bc9-4398-bfaa-802f73c524a3" path="/var/lib/kubelet/pods/c6563320-5bc9-4398-bfaa-802f73c524a3/volumes" Apr 24 19:09:24.640802 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.640762 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerStarted","Data":"9bd3246c96d30d1c94da5b5b024c0435b5c5973812734527d528d0690cc0c2f4"} Apr 24 19:09:24.640802 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.640807 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerStarted","Data":"8b2655f78307969a7fe93a7c5eb62cea477c0ac5a039bb0be54734fcf8590b65"} Apr 24 19:09:24.640999 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.640821 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerStarted","Data":"28afad3429ece4757d74f3775f58f5d194e97f014a9fb3be61e3c51504e7111f"} Apr 24 19:09:24.640999 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.640833 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerStarted","Data":"34c8375a74498cd567bb778af93fbed4adf14f385931157dd12633b81e2f33d0"} Apr 24 19:09:24.640999 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.640846 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerStarted","Data":"f5f23fcb110cff42cbba2184908efae926527072694b27789be8fac77a3e36f0"} Apr 24 19:09:24.640999 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.640858 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ff63673d-1ba8-4037-8ce8-9e7e038a5468","Type":"ContainerStarted","Data":"4471cdd89bc2deb6d506eacda426b89ac229ab62e73b8d08f93e6e2badd043a1"} Apr 24 19:09:24.676901 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:24.676844 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.676824383 podStartE2EDuration="2.676824383s" podCreationTimestamp="2026-04-24 19:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:24.674942117 +0000 UTC m=+163.280233570" watchObservedRunningTime="2026-04-24 19:09:24.676824383 +0000 UTC m=+163.282115800" Apr 24 19:09:24.751214 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:24.750851 2583 configmap.go:193] Couldn't get configMap openshift-monitoring/prometheus-k8s-rulefiles-0: configmap "prometheus-k8s-rulefiles-0" not found Apr 24 19:09:24.751214 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:09:24.750954 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-rulefiles-0 podName:ff63673d-1ba8-4037-8ce8-9e7e038a5468 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:09:25.250929587 +0000 UTC m=+163.856220988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-k8s-rulefiles-0" (UniqueName: "kubernetes.io/configmap/ff63673d-1ba8-4037-8ce8-9e7e038a5468-prometheus-k8s-rulefiles-0") pod "prometheus-k8s-0" (UID: "ff63673d-1ba8-4037-8ce8-9e7e038a5468") : configmap "prometheus-k8s-rulefiles-0" not found Apr 24 19:09:28.006000 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:09:28.005952 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:23.005691 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:10:23.005649 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:23.021312 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:10:23.021280 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:10:23.821439 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:10:23.821410 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 19:11:02.059580 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.059491 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8fr9p"] Apr 24 19:11:02.068298 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.068264 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.070861 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.070828 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8fr9p"] Apr 24 19:11:02.071124 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.071105 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 19:11:02.180620 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.180583 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed54a18d-5919-4a5b-a984-d1d44435a869-kubelet-config\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.180620 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.180629 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed54a18d-5919-4a5b-a984-d1d44435a869-dbus\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.180879 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.180714 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed54a18d-5919-4a5b-a984-d1d44435a869-original-pull-secret\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.281835 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.281797 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/ed54a18d-5919-4a5b-a984-d1d44435a869-kubelet-config\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.281835 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.281842 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed54a18d-5919-4a5b-a984-d1d44435a869-dbus\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.282065 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.281872 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed54a18d-5919-4a5b-a984-d1d44435a869-original-pull-secret\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.282065 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.281949 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed54a18d-5919-4a5b-a984-d1d44435a869-kubelet-config\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.282065 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.281999 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed54a18d-5919-4a5b-a984-d1d44435a869-dbus\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.284302 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.284284 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed54a18d-5919-4a5b-a984-d1d44435a869-original-pull-secret\") pod \"global-pull-secret-syncer-8fr9p\" (UID: \"ed54a18d-5919-4a5b-a984-d1d44435a869\") " pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.379803 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.379720 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8fr9p" Apr 24 19:11:02.504728 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.504689 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8fr9p"] Apr 24 19:11:02.507854 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:11:02.507827 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded54a18d_5919_4a5b_a984_d1d44435a869.slice/crio-2d6bf8d4e10b4c5b3730fa7f1a4666d8940de70cc212abffdf4c67c48f5289d2 WatchSource:0}: Error finding container 2d6bf8d4e10b4c5b3730fa7f1a4666d8940de70cc212abffdf4c67c48f5289d2: Status 404 returned error can't find the container with id 2d6bf8d4e10b4c5b3730fa7f1a4666d8940de70cc212abffdf4c67c48f5289d2 Apr 24 19:11:02.905546 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:02.905508 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8fr9p" event={"ID":"ed54a18d-5919-4a5b-a984-d1d44435a869","Type":"ContainerStarted","Data":"2d6bf8d4e10b4c5b3730fa7f1a4666d8940de70cc212abffdf4c67c48f5289d2"} Apr 24 19:11:06.917945 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:06.917906 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8fr9p" event={"ID":"ed54a18d-5919-4a5b-a984-d1d44435a869","Type":"ContainerStarted","Data":"c77c6bfe5c0cbdcde581ad7d71a2f53201ea6caa28a20ea0951c490f8028a628"} Apr 24 19:11:06.933576 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:06.933436 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8fr9p" podStartSLOduration=1.032957224 podStartE2EDuration="4.933418234s" podCreationTimestamp="2026-04-24 19:11:02 +0000 UTC" firstStartedPulling="2026-04-24 19:11:02.509934981 +0000 UTC m=+261.115226375" lastFinishedPulling="2026-04-24 19:11:06.410395988 +0000 UTC m=+265.015687385" observedRunningTime="2026-04-24 19:11:06.932908117 +0000 UTC m=+265.538199532" watchObservedRunningTime="2026-04-24 19:11:06.933418234 +0000 UTC m=+265.538709651" Apr 24 19:11:41.883820 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:41.883789 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:11:41.884366 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:41.883881 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:11:41.887246 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:11:41.887223 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 19:14:20.298084 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.298040 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-744jm"] Apr 24 19:14:20.301316 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.301299 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-744jm" Apr 24 19:14:20.304084 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.304057 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 19:14:20.304220 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.304157 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 19:14:20.305346 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.305313 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 19:14:20.305346 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.305324 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-f7gfl\"" Apr 24 19:14:20.308044 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.308005 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-744jm"] Apr 24 19:14:20.454561 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.454519 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8w5n\" (UniqueName: \"kubernetes.io/projected/7058f84d-aa34-4d2d-8e89-f3d5cce653a0-kube-api-access-d8w5n\") pod \"s3-init-744jm\" (UID: \"7058f84d-aa34-4d2d-8e89-f3d5cce653a0\") " pod="kserve/s3-init-744jm" Apr 24 19:14:20.556037 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.555945 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8w5n\" (UniqueName: \"kubernetes.io/projected/7058f84d-aa34-4d2d-8e89-f3d5cce653a0-kube-api-access-d8w5n\") pod \"s3-init-744jm\" (UID: \"7058f84d-aa34-4d2d-8e89-f3d5cce653a0\") " pod="kserve/s3-init-744jm" Apr 24 19:14:20.567721 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.567687 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8w5n\" (UniqueName: 
\"kubernetes.io/projected/7058f84d-aa34-4d2d-8e89-f3d5cce653a0-kube-api-access-d8w5n\") pod \"s3-init-744jm\" (UID: \"7058f84d-aa34-4d2d-8e89-f3d5cce653a0\") " pod="kserve/s3-init-744jm" Apr 24 19:14:20.618583 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.618542 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-744jm" Apr 24 19:14:20.740636 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.740599 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-744jm"] Apr 24 19:14:20.743625 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:14:20.743595 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7058f84d_aa34_4d2d_8e89_f3d5cce653a0.slice/crio-17476ffe103adb1434973e51b26cd8967ac22ca99018ed4cab12c4ce81ec44fb WatchSource:0}: Error finding container 17476ffe103adb1434973e51b26cd8967ac22ca99018ed4cab12c4ce81ec44fb: Status 404 returned error can't find the container with id 17476ffe103adb1434973e51b26cd8967ac22ca99018ed4cab12c4ce81ec44fb Apr 24 19:14:20.745862 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:20.745844 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:14:21.455142 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:21.455101 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-744jm" event={"ID":"7058f84d-aa34-4d2d-8e89-f3d5cce653a0","Type":"ContainerStarted","Data":"17476ffe103adb1434973e51b26cd8967ac22ca99018ed4cab12c4ce81ec44fb"} Apr 24 19:14:25.468139 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:25.468041 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-744jm" event={"ID":"7058f84d-aa34-4d2d-8e89-f3d5cce653a0","Type":"ContainerStarted","Data":"a872c4cf29ea858c045eec3b98ce04fd18d42854d4d6823e3e93a77ca08dcebd"} Apr 24 19:14:25.483096 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:25.483046 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-744jm" podStartSLOduration=1.056119178 podStartE2EDuration="5.483030769s" podCreationTimestamp="2026-04-24 19:14:20 +0000 UTC" firstStartedPulling="2026-04-24 19:14:20.7459976 +0000 UTC m=+459.351288995" lastFinishedPulling="2026-04-24 19:14:25.172909178 +0000 UTC m=+463.778200586" observedRunningTime="2026-04-24 19:14:25.482200557 +0000 UTC m=+464.087491983" watchObservedRunningTime="2026-04-24 19:14:25.483030769 +0000 UTC m=+464.088322185" Apr 24 19:14:28.476767 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:28.476731 2583 generic.go:358] "Generic (PLEG): container finished" podID="7058f84d-aa34-4d2d-8e89-f3d5cce653a0" containerID="a872c4cf29ea858c045eec3b98ce04fd18d42854d4d6823e3e93a77ca08dcebd" exitCode=0 Apr 24 19:14:28.477165 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:28.476802 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-744jm" event={"ID":"7058f84d-aa34-4d2d-8e89-f3d5cce653a0","Type":"ContainerDied","Data":"a872c4cf29ea858c045eec3b98ce04fd18d42854d4d6823e3e93a77ca08dcebd"} Apr 24 19:14:29.610879 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:29.610855 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-744jm" Apr 24 19:14:29.736946 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:29.736850 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8w5n\" (UniqueName: \"kubernetes.io/projected/7058f84d-aa34-4d2d-8e89-f3d5cce653a0-kube-api-access-d8w5n\") pod \"7058f84d-aa34-4d2d-8e89-f3d5cce653a0\" (UID: \"7058f84d-aa34-4d2d-8e89-f3d5cce653a0\") " Apr 24 19:14:29.739167 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:29.739144 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7058f84d-aa34-4d2d-8e89-f3d5cce653a0-kube-api-access-d8w5n" (OuterVolumeSpecName: "kube-api-access-d8w5n") pod "7058f84d-aa34-4d2d-8e89-f3d5cce653a0" (UID: "7058f84d-aa34-4d2d-8e89-f3d5cce653a0"). InnerVolumeSpecName "kube-api-access-d8w5n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:14:29.837506 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:29.837465 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d8w5n\" (UniqueName: \"kubernetes.io/projected/7058f84d-aa34-4d2d-8e89-f3d5cce653a0-kube-api-access-d8w5n\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:14:30.482887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:30.482796 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-744jm" event={"ID":"7058f84d-aa34-4d2d-8e89-f3d5cce653a0","Type":"ContainerDied","Data":"17476ffe103adb1434973e51b26cd8967ac22ca99018ed4cab12c4ce81ec44fb"} Apr 24 19:14:30.482887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:30.482816 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-744jm" Apr 24 19:14:30.482887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:14:30.482834 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17476ffe103adb1434973e51b26cd8967ac22ca99018ed4cab12c4ce81ec44fb" Apr 24 19:16:41.901920 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:16:41.901890 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:16:41.902432 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:16:41.902383 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:19:45.107027 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.106987 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v"] Apr 24 19:19:45.107751 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.107450 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7058f84d-aa34-4d2d-8e89-f3d5cce653a0" containerName="s3-init" Apr 24 19:19:45.107751 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.107520 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7058f84d-aa34-4d2d-8e89-f3d5cce653a0" containerName="s3-init" Apr 24 19:19:45.107751 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.107585 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7058f84d-aa34-4d2d-8e89-f3d5cce653a0" containerName="s3-init" Apr 24 19:19:45.110644 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.110623 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.113073 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.113045 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-1911c-kube-rbac-proxy-sar-config\"" Apr 24 19:19:45.113204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.113089 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 19:19:45.113204 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.113158 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pjv9w\"" Apr 24 19:19:45.113565 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.113545 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-1911c-serving-cert\"" Apr 24 19:19:45.119386 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.119364 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v"] Apr 24 19:19:45.266824 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.266787 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36c4bc0b-9659-434f-aa75-1340d759cef0-proxy-tls\") pod \"model-chainer-raw-hpa-1911c-589fc8b564-dz99v\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.266824 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.266836 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36c4bc0b-9659-434f-aa75-1340d759cef0-openshift-service-ca-bundle\") pod 
\"model-chainer-raw-hpa-1911c-589fc8b564-dz99v\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.367297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.367196 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36c4bc0b-9659-434f-aa75-1340d759cef0-proxy-tls\") pod \"model-chainer-raw-hpa-1911c-589fc8b564-dz99v\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.367297 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.367233 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36c4bc0b-9659-434f-aa75-1340d759cef0-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-1911c-589fc8b564-dz99v\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.367870 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.367849 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36c4bc0b-9659-434f-aa75-1340d759cef0-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-1911c-589fc8b564-dz99v\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.369825 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.369800 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36c4bc0b-9659-434f-aa75-1340d759cef0-proxy-tls\") pod \"model-chainer-raw-hpa-1911c-589fc8b564-dz99v\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.422090 
ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.422049 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:45.544920 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.544832 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v"] Apr 24 19:19:45.547513 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:19:45.547484 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c4bc0b_9659_434f_aa75_1340d759cef0.slice/crio-70204f22b14f4485bf896be55be6c9ce297ea6a78553ab043c425af00d20a068 WatchSource:0}: Error finding container 70204f22b14f4485bf896be55be6c9ce297ea6a78553ab043c425af00d20a068: Status 404 returned error can't find the container with id 70204f22b14f4485bf896be55be6c9ce297ea6a78553ab043c425af00d20a068 Apr 24 19:19:45.549306 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:45.549239 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:19:46.342993 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:46.342940 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" event={"ID":"36c4bc0b-9659-434f-aa75-1340d759cef0","Type":"ContainerStarted","Data":"70204f22b14f4485bf896be55be6c9ce297ea6a78553ab043c425af00d20a068"} Apr 24 19:19:48.349411 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:48.349371 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" event={"ID":"36c4bc0b-9659-434f-aa75-1340d759cef0","Type":"ContainerStarted","Data":"80e9eafd50b58e7a9f02ead102a31a1e2412ef8ab65dd2480c28751ab466dd97"} Apr 24 19:19:48.349887 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:48.349501 2583 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:48.366234 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:48.366186 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podStartSLOduration=1.020424751 podStartE2EDuration="3.366170535s" podCreationTimestamp="2026-04-24 19:19:45 +0000 UTC" firstStartedPulling="2026-04-24 19:19:45.54941686 +0000 UTC m=+784.154708254" lastFinishedPulling="2026-04-24 19:19:47.895162641 +0000 UTC m=+786.500454038" observedRunningTime="2026-04-24 19:19:48.365202539 +0000 UTC m=+786.970493955" watchObservedRunningTime="2026-04-24 19:19:48.366170535 +0000 UTC m=+786.971461950" Apr 24 19:19:54.357910 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:54.357881 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:19:55.170064 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:55.170025 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v"] Apr 24 19:19:55.170288 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:55.170243 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" containerID="cri-o://80e9eafd50b58e7a9f02ead102a31a1e2412ef8ab65dd2480c28751ab466dd97" gracePeriod=30 Apr 24 19:19:59.356023 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:19:59.355938 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
19:20:04.357883 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:04.357842 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:09.356446 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:09.356399 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:09.356903 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:09.356518 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:20:14.355855 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:14.355812 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:19.356666 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:19.356627 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:24.356275 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:24.356215 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:25.451937 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.451846 2583 generic.go:358] "Generic (PLEG): container finished" podID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerID="80e9eafd50b58e7a9f02ead102a31a1e2412ef8ab65dd2480c28751ab466dd97" exitCode=0 Apr 24 19:20:25.451937 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.451927 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" event={"ID":"36c4bc0b-9659-434f-aa75-1340d759cef0","Type":"ContainerDied","Data":"80e9eafd50b58e7a9f02ead102a31a1e2412ef8ab65dd2480c28751ab466dd97"} Apr 24 19:20:25.807485 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.807461 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:20:25.861876 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.861842 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36c4bc0b-9659-434f-aa75-1340d759cef0-openshift-service-ca-bundle\") pod \"36c4bc0b-9659-434f-aa75-1340d759cef0\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " Apr 24 19:20:25.862057 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.861888 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36c4bc0b-9659-434f-aa75-1340d759cef0-proxy-tls\") pod \"36c4bc0b-9659-434f-aa75-1340d759cef0\" (UID: \"36c4bc0b-9659-434f-aa75-1340d759cef0\") " Apr 24 19:20:25.862309 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.862267 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c4bc0b-9659-434f-aa75-1340d759cef0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod 
"36c4bc0b-9659-434f-aa75-1340d759cef0" (UID: "36c4bc0b-9659-434f-aa75-1340d759cef0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:20:25.864213 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.864192 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c4bc0b-9659-434f-aa75-1340d759cef0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "36c4bc0b-9659-434f-aa75-1340d759cef0" (UID: "36c4bc0b-9659-434f-aa75-1340d759cef0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:20:25.962581 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.962544 2583 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36c4bc0b-9659-434f-aa75-1340d759cef0-openshift-service-ca-bundle\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:20:25.962581 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:25.962574 2583 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36c4bc0b-9659-434f-aa75-1340d759cef0-proxy-tls\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:20:26.455201 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:26.455116 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" Apr 24 19:20:26.455599 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:26.455119 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v" event={"ID":"36c4bc0b-9659-434f-aa75-1340d759cef0","Type":"ContainerDied","Data":"70204f22b14f4485bf896be55be6c9ce297ea6a78553ab043c425af00d20a068"} Apr 24 19:20:26.455599 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:26.455233 2583 scope.go:117] "RemoveContainer" containerID="80e9eafd50b58e7a9f02ead102a31a1e2412ef8ab65dd2480c28751ab466dd97" Apr 24 19:20:26.469896 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:26.469868 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v"] Apr 24 19:20:26.473461 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:26.473435 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-1911c-589fc8b564-dz99v"] Apr 24 19:20:28.017615 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:20:28.017584 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" path="/var/lib/kubelet/pods/36c4bc0b-9659-434f-aa75-1340d759cef0/volumes" Apr 24 19:21:41.919225 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:21:41.919198 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:21:41.921598 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:21:41.921577 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:26:41.939545 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:26:41.939512 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:26:41.943045 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:26:41.943023 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log" Apr 24 19:28:33.797734 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.797699 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8gt5z/must-gather-dsg88"] Apr 24 19:28:33.799743 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.797982 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" Apr 24 19:28:33.799743 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.797994 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" Apr 24 19:28:33.799743 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.798050 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="36c4bc0b-9659-434f-aa75-1340d759cef0" containerName="model-chainer-raw-hpa-1911c" Apr 24 19:28:33.800559 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.800539 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:33.803005 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.802975 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8gt5z\"/\"default-dockercfg-rtskl\"" Apr 24 19:28:33.803143 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.803064 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8gt5z\"/\"openshift-service-ca.crt\"" Apr 24 19:28:33.803914 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.803898 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8gt5z\"/\"kube-root-ca.crt\"" Apr 24 19:28:33.808398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.808178 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8gt5z/must-gather-dsg88"] Apr 24 19:28:33.841035 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.840996 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcjtn\" (UniqueName: \"kubernetes.io/projected/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-kube-api-access-hcjtn\") pod \"must-gather-dsg88\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:33.841199 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.841049 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-must-gather-output\") pod \"must-gather-dsg88\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:33.942465 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.942434 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcjtn\" (UniqueName: 
\"kubernetes.io/projected/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-kube-api-access-hcjtn\") pod \"must-gather-dsg88\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:33.942600 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.942474 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-must-gather-output\") pod \"must-gather-dsg88\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:33.942834 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.942815 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-must-gather-output\") pod \"must-gather-dsg88\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:33.951203 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:33.951172 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcjtn\" (UniqueName: \"kubernetes.io/projected/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-kube-api-access-hcjtn\") pod \"must-gather-dsg88\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:34.117626 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:34.117527 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:28:34.239961 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:34.239925 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8gt5z/must-gather-dsg88"] Apr 24 19:28:34.243307 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:28:34.243280 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d462f0_0656_4a95_8f70_8e0ff0aa325e.slice/crio-1fdab10a591aec853dc83ca3dfc73a78341b288fbc60166c5bf0187bcf8d7b71 WatchSource:0}: Error finding container 1fdab10a591aec853dc83ca3dfc73a78341b288fbc60166c5bf0187bcf8d7b71: Status 404 returned error can't find the container with id 1fdab10a591aec853dc83ca3dfc73a78341b288fbc60166c5bf0187bcf8d7b71 Apr 24 19:28:34.245473 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:34.245452 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:28:34.773308 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:34.773269 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8gt5z/must-gather-dsg88" event={"ID":"b8d462f0-0656-4a95-8f70-8e0ff0aa325e","Type":"ContainerStarted","Data":"1fdab10a591aec853dc83ca3dfc73a78341b288fbc60166c5bf0187bcf8d7b71"} Apr 24 19:28:39.788484 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:39.788442 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8gt5z/must-gather-dsg88" event={"ID":"b8d462f0-0656-4a95-8f70-8e0ff0aa325e","Type":"ContainerStarted","Data":"f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55"} Apr 24 19:28:39.788484 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:39.788491 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8gt5z/must-gather-dsg88" 
event={"ID":"b8d462f0-0656-4a95-8f70-8e0ff0aa325e","Type":"ContainerStarted","Data":"db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a"} Apr 24 19:28:39.808039 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:39.807989 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8gt5z/must-gather-dsg88" podStartSLOduration=1.7737877530000001 podStartE2EDuration="6.807972592s" podCreationTimestamp="2026-04-24 19:28:33 +0000 UTC" firstStartedPulling="2026-04-24 19:28:34.245633228 +0000 UTC m=+1312.850924630" lastFinishedPulling="2026-04-24 19:28:39.279818061 +0000 UTC m=+1317.885109469" observedRunningTime="2026-04-24 19:28:39.806579818 +0000 UTC m=+1318.411871235" watchObservedRunningTime="2026-04-24 19:28:39.807972592 +0000 UTC m=+1318.413264052" Apr 24 19:28:56.842538 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:56.842504 2583 generic.go:358] "Generic (PLEG): container finished" podID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerID="db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a" exitCode=0 Apr 24 19:28:56.842956 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:56.842556 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8gt5z/must-gather-dsg88" event={"ID":"b8d462f0-0656-4a95-8f70-8e0ff0aa325e","Type":"ContainerDied","Data":"db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a"} Apr 24 19:28:56.842956 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:56.842859 2583 scope.go:117] "RemoveContainer" containerID="db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a" Apr 24 19:28:57.737712 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:28:57.737683 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8gt5z_must-gather-dsg88_b8d462f0-0656-4a95-8f70-8e0ff0aa325e/gather/0.log" Apr 24 19:29:00.870268 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:00.870127 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-8fr9p_ed54a18d-5919-4a5b-a984-d1d44435a869/global-pull-secret-syncer/0.log" Apr 24 19:29:01.075831 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:01.075801 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fpdqf_1daf02a1-51a2-4eb1-a1b5-ed9667d63027/konnectivity-agent/0.log" Apr 24 19:29:01.095591 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:01.095555 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-23.ec2.internal_7361f60877a2c10988a706360ce354df/haproxy/0.log" Apr 24 19:29:03.117342 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.117304 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8gt5z/must-gather-dsg88"] Apr 24 19:29:03.117734 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.117538 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-8gt5z/must-gather-dsg88" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerName="copy" containerID="cri-o://f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55" gracePeriod=2 Apr 24 19:29:03.122081 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.121461 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8gt5z/must-gather-dsg88"] Apr 24 19:29:03.343053 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.343030 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8gt5z_must-gather-dsg88_b8d462f0-0656-4a95-8f70-8e0ff0aa325e/copy/0.log" Apr 24 19:29:03.343403 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.343386 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:29:03.345798 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.345774 2583 status_manager.go:895] "Failed to get status for pod" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" pod="openshift-must-gather-8gt5z/must-gather-dsg88" err="pods \"must-gather-dsg88\" is forbidden: User \"system:node:ip-10-0-137-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8gt5z\": no relationship found between node 'ip-10-0-137-23.ec2.internal' and this object" Apr 24 19:29:03.488289 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.488167 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcjtn\" (UniqueName: \"kubernetes.io/projected/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-kube-api-access-hcjtn\") pod \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " Apr 24 19:29:03.488456 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.488295 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-must-gather-output\") pod \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\" (UID: \"b8d462f0-0656-4a95-8f70-8e0ff0aa325e\") " Apr 24 19:29:03.489662 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.489632 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b8d462f0-0656-4a95-8f70-8e0ff0aa325e" (UID: "b8d462f0-0656-4a95-8f70-8e0ff0aa325e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:29:03.490725 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.490705 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-kube-api-access-hcjtn" (OuterVolumeSpecName: "kube-api-access-hcjtn") pod "b8d462f0-0656-4a95-8f70-8e0ff0aa325e" (UID: "b8d462f0-0656-4a95-8f70-8e0ff0aa325e"). InnerVolumeSpecName "kube-api-access-hcjtn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:29:03.589303 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.589241 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hcjtn\" (UniqueName: \"kubernetes.io/projected/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-kube-api-access-hcjtn\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:29:03.589303 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.589298 2583 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8d462f0-0656-4a95-8f70-8e0ff0aa325e-must-gather-output\") on node \"ip-10-0-137-23.ec2.internal\" DevicePath \"\"" Apr 24 19:29:03.860650 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.860621 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8gt5z_must-gather-dsg88_b8d462f0-0656-4a95-8f70-8e0ff0aa325e/copy/0.log" Apr 24 19:29:03.860964 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.860943 2583 generic.go:358] "Generic (PLEG): container finished" podID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerID="f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55" exitCode=143 Apr 24 19:29:03.861030 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.860990 2583 scope.go:117] "RemoveContainer" containerID="f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55" Apr 24 19:29:03.861030 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.860990 2583 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8gt5z/must-gather-dsg88" Apr 24 19:29:03.863379 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.863349 2583 status_manager.go:895] "Failed to get status for pod" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" pod="openshift-must-gather-8gt5z/must-gather-dsg88" err="pods \"must-gather-dsg88\" is forbidden: User \"system:node:ip-10-0-137-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8gt5z\": no relationship found between node 'ip-10-0-137-23.ec2.internal' and this object" Apr 24 19:29:03.868732 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.868712 2583 scope.go:117] "RemoveContainer" containerID="db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a" Apr 24 19:29:03.871152 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.871127 2583 status_manager.go:895] "Failed to get status for pod" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" pod="openshift-must-gather-8gt5z/must-gather-dsg88" err="pods \"must-gather-dsg88\" is forbidden: User \"system:node:ip-10-0-137-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8gt5z\": no relationship found between node 'ip-10-0-137-23.ec2.internal' and this object" Apr 24 19:29:03.880989 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.880970 2583 scope.go:117] "RemoveContainer" containerID="f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55" Apr 24 19:29:03.881329 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:29:03.881245 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55\": container with ID starting with f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55 not found: ID does not exist" 
containerID="f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55" Apr 24 19:29:03.881416 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.881342 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55"} err="failed to get container status \"f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55\": rpc error: code = NotFound desc = could not find container \"f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55\": container with ID starting with f54ac32fc5dc1b3c3aa10badde270f99c42aa8dfcedbce5c4788b4d9a17a4f55 not found: ID does not exist" Apr 24 19:29:03.881416 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.881384 2583 scope.go:117] "RemoveContainer" containerID="db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a" Apr 24 19:29:03.881645 ip-10-0-137-23 kubenswrapper[2583]: E0424 19:29:03.881628 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a\": container with ID starting with db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a not found: ID does not exist" containerID="db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a" Apr 24 19:29:03.881681 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:03.881652 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a"} err="failed to get container status \"db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a\": rpc error: code = NotFound desc = could not find container \"db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a\": container with ID starting with db455b09a37299b47796bb9133c91fbe40fe140eeda26f9a24a51402b048a59a not found: ID does not exist" Apr 24 
19:29:04.017965 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:04.017926 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" path="/var/lib/kubelet/pods/b8d462f0-0656-4a95-8f70-8e0ff0aa325e/volumes" Apr 24 19:29:04.928914 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:04.928830 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-n2p7v_c2485ee1-2466-4a88-b136-103a66abef2f/kube-state-metrics/0.log" Apr 24 19:29:04.948331 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:04.948304 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-n2p7v_c2485ee1-2466-4a88-b136-103a66abef2f/kube-rbac-proxy-main/0.log" Apr 24 19:29:04.969054 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:04.969013 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-n2p7v_c2485ee1-2466-4a88-b136-103a66abef2f/kube-rbac-proxy-self/0.log" Apr 24 19:29:05.044026 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.043996 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmsk6_efda0d48-2fab-4250-bd44-8d6f6bc536e2/node-exporter/0.log" Apr 24 19:29:05.064075 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.064043 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmsk6_efda0d48-2fab-4250-bd44-8d6f6bc536e2/kube-rbac-proxy/0.log" Apr 24 19:29:05.083658 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.083634 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmsk6_efda0d48-2fab-4250-bd44-8d6f6bc536e2/init-textfile/0.log" Apr 24 19:29:05.306780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.306737 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ff63673d-1ba8-4037-8ce8-9e7e038a5468/prometheus/0.log" Apr 24 19:29:05.323127 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.323092 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ff63673d-1ba8-4037-8ce8-9e7e038a5468/config-reloader/0.log" Apr 24 19:29:05.342662 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.342623 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ff63673d-1ba8-4037-8ce8-9e7e038a5468/thanos-sidecar/0.log" Apr 24 19:29:05.364774 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.364751 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ff63673d-1ba8-4037-8ce8-9e7e038a5468/kube-rbac-proxy-web/0.log" Apr 24 19:29:05.383655 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.383621 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ff63673d-1ba8-4037-8ce8-9e7e038a5468/kube-rbac-proxy/0.log" Apr 24 19:29:05.400678 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.400651 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ff63673d-1ba8-4037-8ce8-9e7e038a5468/kube-rbac-proxy-thanos/0.log" Apr 24 19:29:05.419532 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:05.419512 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ff63673d-1ba8-4037-8ce8-9e7e038a5468/init-config-reloader/0.log" Apr 24 19:29:08.241702 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.241669 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"] Apr 24 19:29:08.242066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.241952 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" 
containerName="gather"
Apr 24 19:29:08.242066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.241963 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerName="gather"
Apr 24 19:29:08.242066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.241973 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerName="copy"
Apr 24 19:29:08.242066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.241979 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerName="copy"
Apr 24 19:29:08.242066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.242030 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerName="copy"
Apr 24 19:29:08.242066 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.242039 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8d462f0-0656-4a95-8f70-8e0ff0aa325e" containerName="gather"
Apr 24 19:29:08.244966 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.244944 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.247287 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.247246 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jzlrb\"/\"openshift-service-ca.crt\""
Apr 24 19:29:08.248214 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.248197 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jzlrb\"/\"kube-root-ca.crt\""
Apr 24 19:29:08.248349 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.248199 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jzlrb\"/\"default-dockercfg-df8sl\""
Apr 24 19:29:08.252711 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.252689 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"]
Apr 24 19:29:08.323893 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.323859 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-podres\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.323893 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.323894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-sys\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.324101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.323935 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-lib-modules\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.324101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.323971 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-proc\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.324101 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.323987 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m2lb\" (UniqueName: \"kubernetes.io/projected/d766defa-68d9-42c7-bc88-3b20add291f5-kube-api-access-2m2lb\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424700 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424665 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-podres\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424700 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424701 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-sys\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
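The reconciler and operation_generator entries around this point walk each volume of perf-node-gather-daemonset-zmr6n through the same lifecycle: VerifyControllerAttachedVolume started, MountVolume started, then MountVolume.SetUp succeeded. When triaging a stuck mount it helps to group entries by volume name and check which phase is missing. A minimal sketch of that grouping (the sample lines are shortened stand-ins for the full journal entries above; `volume_phases` is a hypothetical helper, not part of any kubelet tooling):

```python
import re

# Shortened stand-ins for the reconciler/operation_generator entries above;
# the escaped quotes around the volume name are kept as they appear in the log.
LINES = [
    'I0424 19:29:08.323859 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \\"podres\\""',
    'I0424 19:29:08.424665 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \\"podres\\""',
    'I0424 19:29:08.424846 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \\"podres\\""',
]

# Match the lifecycle phase, then the \"-quoted volume name that follows it.
PHASE_RE = re.compile(
    r'(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.SetUp succeeded)'
    r'.*?volume \\"([^"\\]+)\\"'
)

def volume_phases(lines):
    """Map volume name -> ordered list of lifecycle phases seen in the log."""
    phases = {}
    for line in lines:
        m = PHASE_RE.search(line)
        if m:
            phases.setdefault(m.group(2), []).append(m.group(1))
    return phases

print(volume_phases(LINES))
```

A volume whose list stops after "MountVolume started" never reached SetUp, which narrows the search to the mount itself rather than attach/detach.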
Apr 24 19:29:08.424952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424750 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-lib-modules\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424775 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-proc\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424796 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m2lb\" (UniqueName: \"kubernetes.io/projected/d766defa-68d9-42c7-bc88-3b20add291f5-kube-api-access-2m2lb\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424846 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-podres\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424846 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-sys\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424880 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-proc\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.424952 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.424930 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d766defa-68d9-42c7-bc88-3b20add291f5-lib-modules\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.432626 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.432590 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m2lb\" (UniqueName: \"kubernetes.io/projected/d766defa-68d9-42c7-bc88-3b20add291f5-kube-api-access-2m2lb\") pod \"perf-node-gather-daemonset-zmr6n\" (UID: \"d766defa-68d9-42c7-bc88-3b20add291f5\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.556431 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.556400 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.672236 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.672194 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"]
Apr 24 19:29:08.675431 ip-10-0-137-23 kubenswrapper[2583]: W0424 19:29:08.675405 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd766defa_68d9_42c7_bc88_3b20add291f5.slice/crio-09f359bacbebc2055d8c192f1a33bc00dfb85127ebd5a58c21b6f620cd1ecffb WatchSource:0}: Error finding container 09f359bacbebc2055d8c192f1a33bc00dfb85127ebd5a58c21b6f620cd1ecffb: Status 404 returned error can't find the container with id 09f359bacbebc2055d8c192f1a33bc00dfb85127ebd5a58c21b6f620cd1ecffb
Apr 24 19:29:08.878705 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.878668 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n" event={"ID":"d766defa-68d9-42c7-bc88-3b20add291f5","Type":"ContainerStarted","Data":"960a4c7aacfbe467caf2093c70e95defccd8e0bf03319df9253f69394232dade"}
Apr 24 19:29:08.878705 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.878708 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n" event={"ID":"d766defa-68d9-42c7-bc88-3b20add291f5","Type":"ContainerStarted","Data":"09f359bacbebc2055d8c192f1a33bc00dfb85127ebd5a58c21b6f620cd1ecffb"}
Apr 24 19:29:08.878951 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.878784 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:08.898863 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.898815 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n" podStartSLOduration=0.898798909 podStartE2EDuration="898.798909ms" podCreationTimestamp="2026-04-24 19:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:29:08.897692447 +0000 UTC m=+1347.502983866" watchObservedRunningTime="2026-04-24 19:29:08.898798909 +0000 UTC m=+1347.504090324"
Apr 24 19:29:08.915158 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.915119 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zl4tq_c680e87e-93f7-42bb-8645-95c2ba5b415e/dns/0.log"
Apr 24 19:29:08.934780 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.934757 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zl4tq_c680e87e-93f7-42bb-8645-95c2ba5b415e/kube-rbac-proxy/0.log"
Apr 24 19:29:08.960142 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:08.960113 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lclhm_23ef5eca-c6e0-4efd-9dd6-fe47c8b220e1/dns-node-resolver/0.log"
Apr 24 19:29:09.452216 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:09.452173 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-887cb6fd5-fs5mk_635da0a1-4f63-4344-b2a2-310bcbfbe50c/registry/0.log"
Apr 24 19:29:09.520104 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:09.520076 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mm5fz_c359622a-4d36-4dcb-b06d-e8b0a4c453ad/node-ca/0.log"
Apr 24 19:29:10.609110 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:10.609078 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p67rd_e08e9f5e-5277-4567-a799-97a88665243a/serve-healthcheck-canary/0.log"
Apr 24 19:29:11.045500 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:11.045464 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6vdfs_e5839401-04b4-44b8-a2ca-b823d81b3ac6/kube-rbac-proxy/0.log"
Apr 24 19:29:11.066130 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:11.066100 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6vdfs_e5839401-04b4-44b8-a2ca-b823d81b3ac6/exporter/0.log"
Apr 24 19:29:11.100752 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:11.100724 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6vdfs_e5839401-04b4-44b8-a2ca-b823d81b3ac6/extractor/0.log"
Apr 24 19:29:13.263790 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:13.263754 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-744jm_7058f84d-aa34-4d2d-8e89-f3d5cce653a0/s3-init/0.log"
Apr 24 19:29:14.891446 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:14.891412 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-zmr6n"
Apr 24 19:29:16.937518 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:16.937440 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rlrgb_21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb/migrator/0.log"
Apr 24 19:29:16.960572 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:16.960547 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rlrgb_21f31c0d-2e5a-4ef1-bb98-2dbf4e3074cb/graceful-termination/0.log"
Apr 24 19:29:18.621422 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.621392 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdfd7_2a4ef86a-6412-43fa-ba15-979962cfdfad/kube-multus-additional-cni-plugins/0.log"
Apr 24 19:29:18.640787 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.640757 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdfd7_2a4ef86a-6412-43fa-ba15-979962cfdfad/egress-router-binary-copy/0.log"
Apr 24 19:29:18.660215 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.660186 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdfd7_2a4ef86a-6412-43fa-ba15-979962cfdfad/cni-plugins/0.log"
Apr 24 19:29:18.679116 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.679086 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdfd7_2a4ef86a-6412-43fa-ba15-979962cfdfad/bond-cni-plugin/0.log"
Apr 24 19:29:18.699243 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.699216 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdfd7_2a4ef86a-6412-43fa-ba15-979962cfdfad/routeoverride-cni/0.log"
Apr 24 19:29:18.718286 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.718233 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdfd7_2a4ef86a-6412-43fa-ba15-979962cfdfad/whereabouts-cni-bincopy/0.log"
Apr 24 19:29:18.740919 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.740898 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qdfd7_2a4ef86a-6412-43fa-ba15-979962cfdfad/whereabouts-cni/0.log"
Apr 24 19:29:18.768635 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.768603 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lv66n_39a410d7-8c61-49c0-8950-af91f35238f3/kube-multus/0.log"
Apr 24 19:29:18.883308 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.883198 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6x9g_c0ea34e5-a89a-4142-83d4-e94ef986bfa4/network-metrics-daemon/0.log"
Apr 24 19:29:18.903499 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:18.903473 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f6x9g_c0ea34e5-a89a-4142-83d4-e94ef986bfa4/kube-rbac-proxy/0.log"
Apr 24 19:29:20.328524 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.328490 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-controller/0.log"
Apr 24 19:29:20.351398 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.351359 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/0.log"
Apr 24 19:29:20.357578 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.357555 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovn-acl-logging/1.log"
Apr 24 19:29:20.374965 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.374937 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/kube-rbac-proxy-node/0.log"
Apr 24 19:29:20.394424 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.394392 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 19:29:20.413382 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.413349 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/northd/0.log"
Apr 24 19:29:20.430892 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.430867 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/nbdb/0.log"
Apr 24 19:29:20.452879 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.452851 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/sbdb/0.log"
Apr 24 19:29:20.550457 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:20.550420 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b2ftq_c3b9dbc7-3f43-4d25-9375-c9c4859dd641/ovnkube-controller/0.log"
Apr 24 19:29:21.516674 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:21.516640 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-n6v84_12b20576-da14-4ba1-926b-fed787f86bfb/network-check-target-container/0.log"
Apr 24 19:29:22.397784 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:22.397755 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xzp6g_3a2993c8-f0a2-46ca-be46-e34e78416219/iptables-alerter/0.log"
Apr 24 19:29:23.023897 ip-10-0-137-23 kubenswrapper[2583]: I0424 19:29:23.023870 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-z56hw_c4078663-d8da-42c2-8049-c863b1b49ea9/tuned/0.log"
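The "Finished parsing log file" entries all point beneath /var/log/pods, where the kubelet lays out container logs as `<namespace>_<pod-name>_<pod-uid>/<container-name>/<restart-count>.log`. A small sketch that recovers those fields from one of the paths in this log (`parse_pod_log_path` is a hypothetical helper name, not kubelet API):

```python
from pathlib import PurePosixPath

def parse_pod_log_path(path):
    """Split a /var/log/pods/... container log path into its parts.

    Assumed layout: /var/log/pods/<namespace>_<pod>_<uid>/<container>/<restart>.log
    """
    p = PurePosixPath(path)
    # The pod directory joins namespace, pod name, and UID with underscores;
    # pod names and namespaces cannot contain "_", so split("_", 2) is safe.
    namespace, pod, uid = p.parent.parent.name.split("_", 2)
    return {
        "namespace": namespace,
        "pod": pod,
        "uid": uid,
        "container": p.parent.name,
        "restart": int(p.stem),  # "0.log" -> restart count 0
    }

info = parse_pod_log_path(
    "/var/log/pods/openshift-dns_dns-default-zl4tq_"
    "c680e87e-93f7-42bb-8645-95c2ba5b415e/dns/0.log"
)
print(info)
```

Applied to the ovn-acl-logging entries above, the restart count distinguishes the 0.log and 1.log files: a 1.log means that container has been restarted at least once on this node.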