Apr 16 13:59:06.188943 ip-10-0-128-29 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:06.706579 ip-10-0-128-29 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:06.706579 ip-10-0-128-29 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:06.706579 ip-10-0-128-29 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:06.706579 ip-10-0-128-29 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:06.706579 ip-10-0-128-29 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:06.711414 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.711321 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:06.716297 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716281 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:06.716297 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716297 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716301 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716305 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716308 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716312 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716315 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716318 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716321 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716324 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716326 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716329 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716332 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716334 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716337 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716339 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716342 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716345 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716347 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716350 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716353 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:06.716360 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716355 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716363 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716367 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716371 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716376 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716378 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716381 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716383 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716386 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716389 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716391 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716394 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716397 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716399 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716402 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716405 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716409 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716413 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716416 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:06.716847 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716419 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716422 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716424 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716427 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716429 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716432 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716434 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716436 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716439 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716441 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716445 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716448 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716450 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716453 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716456 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716459 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716462 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716465 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716467 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716470 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:06.717320 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716473 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716476 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716478 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716482 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716485 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716487 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716490 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716492 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716495 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716497 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716500 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716503 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716506 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716508 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716510 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716513 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716516 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716518 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716521 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:06.717915 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716525 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716527 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716530 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716533 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716535 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716538 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716540 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716925 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716931 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716934 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716937 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716940 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716942 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716945 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716948 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716950 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716953 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716955 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716958 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:06.718408 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716961 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716963 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716966 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716969 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716971 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716976 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716979 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716982 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716985 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716988 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716991 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716995 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.716998 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717001 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717004 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717007 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717010 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717012 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:06.718889 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717015 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717018 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717021 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717024 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717027 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717030 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717032 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717035 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717037 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717040 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717042 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717045 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717048 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717050 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717052 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717056 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717058 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717061 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717064 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717067 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:06.719339 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717070 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717073 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717075 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717078 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717081 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717084 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717087 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717090 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717093 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717096 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717098 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717100 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717103 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717105 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717108 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717110 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717113 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717115 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717118 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717121 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:06.719839 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717123 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717125 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717128 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717130 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717133 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717135 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717138 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717140 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717143 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717145 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717148 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717151 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717154 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717157 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717159 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.717162 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718189 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718203 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718209 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718213 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718218 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:06.720329 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718221 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718225 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718230 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718233 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718236 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718239 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718243 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718246 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718249 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718252 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718255 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718258 2582 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718261 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718264 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718270 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718272 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718276 2582 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718279 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718282 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718290 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718293 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718297 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718301 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718304 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:06.720847 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718308 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718311 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718314 2582 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718317 2582 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718322 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718325 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718328 2582 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718331 2582 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718334 2582 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718337 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718341 2582 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718345 2582 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718348 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718351 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718354 2582 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718358 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718361 2582 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718365 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718368 2582 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718371 2582 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718373 2582 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718376 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718379 2582 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718382 2582 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718385 2582 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:06.721415 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718388 2582 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718392 2582 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718396 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718400 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718404 2582 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718407 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718411 2582 flags.go:64] FLAG: --help="false" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718415 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-128-29.ec2.internal" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718419 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718422 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718425 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718428 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718432 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718435 2582 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718438 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718441 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718444 2582 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718447 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:06.722059 ip-10-0-128-29 
kubenswrapper[2582]: I0416 13:59:06.718450 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718453 2582 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718456 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718459 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718463 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718465 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:06.722059 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718468 2582 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718471 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718474 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718477 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718483 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718486 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718489 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718492 2582 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718495 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 
13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718499 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718503 2582 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718507 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718511 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718514 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718519 2582 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718522 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718526 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718529 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718532 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718535 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718538 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718541 2582 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718548 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 
13:59:06.718552 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718555 2582 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:06.722634 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718558 2582 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718561 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718567 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718570 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718574 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718577 2582 flags.go:64] FLAG: --port="10250" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718580 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718582 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f3d07822471d9fc2" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718586 2582 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718589 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718592 2582 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718595 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718598 2582 flags.go:64] FLAG: 
--register-with-taints="" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718601 2582 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718605 2582 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718608 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718610 2582 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718614 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718619 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718622 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718625 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718628 2582 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718631 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718634 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718637 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:06.723264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718640 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718643 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718646 2582 
flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718649 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718652 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718655 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718658 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718661 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718664 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718667 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718670 2582 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718673 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718678 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718691 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718694 2582 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718698 2582 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718701 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:06.723899 ip-10-0-128-29 
kubenswrapper[2582]: I0416 13:59:06.718704 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718707 2582 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718710 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718713 2582 flags.go:64] FLAG: --v="2" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718717 2582 flags.go:64] FLAG: --version="false" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718721 2582 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718726 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.718729 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:06.723899 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718824 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718827 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718830 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718833 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718836 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718839 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:06.724493 
ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718842 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718844 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718847 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718850 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718852 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718855 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718857 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718860 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718863 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718865 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718868 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718871 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718874 2582 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718876 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:06.724493 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718879 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718883 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718887 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718890 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718892 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718895 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718898 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718901 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718904 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718906 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718909 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: 
W0416 13:59:06.718912 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718915 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718918 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718921 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718923 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718926 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718929 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718931 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718935 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:06.725023 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718937 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718940 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718943 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718946 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 
13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718948 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718951 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718954 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718956 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718959 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718962 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718964 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718967 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718969 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718972 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718974 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718978 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718982 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718985 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718988 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718991 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:06.725517 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718994 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718996 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.718999 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719002 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719006 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719009 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719011 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719014 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719016 2582 feature_gate.go:328] 
unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719019 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719022 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719024 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719027 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719031 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719034 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719037 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719039 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719042 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719044 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:06.726097 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719047 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:06.726579 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719050 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 
13:59:06.726579 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719052 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:06.726579 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719055 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:06.726579 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719058 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:06.726579 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719060 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:06.726579 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.719063 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:06.726579 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.719751 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:06.726997 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.726979 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:06.727027 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.726997 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727041 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727045 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727049 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727052 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727055 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727057 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727060 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:06.727059 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727063 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727066 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727069 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727072 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727074 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727077 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727080 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727082 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727085 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727087 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727090 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727093 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727096 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727098 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727101 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727103 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727106 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727109 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727111 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727113 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:06.727265 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727116 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727118 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727121 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727124 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727127 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727129 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727132 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727134 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727137 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727139 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727142 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727144 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727147 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727149 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727153 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727155 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727158 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727161 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727164 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:06.727825 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727166 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727169 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727171 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727176 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727180 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727183 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727186 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727189 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727191 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727194 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727196 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727199 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727201 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727205 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727208 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727211 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727214 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727217 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727220 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:06.728299 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727222 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727225 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727228 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727231 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727233 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727236 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727238 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727241 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727244 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727247 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727250 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727252 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727255 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727257 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727260 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727262 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727265 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727268 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727271 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727273 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:06.728788 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727276 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.727281 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727379 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727385 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727389 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727393 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727396 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727399 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727402 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727404 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727407 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727410 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727413 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727415 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727418 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:06.729264 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727420 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727423 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727425 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727428 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727431 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727434 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727436 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727439 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727442 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727444 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727447 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727450 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727452 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727455 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727457 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727460 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727463 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727465 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727468 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:06.729634 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727470 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727473 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727475 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727478 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727480 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727483 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727485 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727488 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727490 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727493 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727496 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727499 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727501 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727504 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727506 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727508 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727511 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727513 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727516 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727519 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:06.730135 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727521 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727524 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727526 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727529 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727531 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727534 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727536 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727539 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727541 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727544 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727546 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727549 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727552 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727554 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727556 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727559 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727561 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727564 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727566 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727569 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:06.730627 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727571 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727574 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727577 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727579 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727587 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727589 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727592 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727594 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727597 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727599 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727603 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727606 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727609 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:06.727612 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.727617 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:06.731173 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.728396 2582 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:06.731541 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.730723 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:06.731836 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.731824 2582 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:06.731940 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.731922 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:06.731982 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.731963 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:06.764110 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.764087 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:06.767418 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.767284 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:06.786537 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.786515 2582 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:06.792659 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.792636 2582 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:06.793881 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.793861 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:06.795051 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.795035 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:06.799220 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.799200 2582 fs.go:135] Filesystem UUIDs: map[34c71382-13b0-4262-9b2b-e2f9d4466b35:/dev/nvme0n1p3 6a69db6c-7c63-4150-9eda-7b836ee5c3cb:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:59:06.799295 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.799218 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:06.805537 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.805422 2582 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:06.802942487 +0000 UTC m=+0.475843771 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3058060 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d1a581dff4d03d2f119c3a038cf9e SystemUUID:ec2d1a58-1dff-4d03-d2f1-19c3a038cf9e BootID:ad55f46c-5a2c-4cc9-a4ef-cec90401f641 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fe:46:32:f7:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fe:46:32:f7:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:59:bc:6c:09:54 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:06.805537 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.805528 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:06.805661 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.805608 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:06.806882 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.806858 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:06.807021 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.806885 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-29.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:06.807063 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.807030 2582 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:06.807063 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.807039 2582 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:06.807063 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.807052 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:06.807841 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.807830 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:06.808720 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.808710 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:06.808843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.808834 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 13:59:06.811446 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.811437 2582 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 13:59:06.811480 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.811456 2582 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 13:59:06.811480 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.811468 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 13:59:06.811480 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.811477 2582 kubelet.go:397] "Adding apiserver pod source"
Apr 16 13:59:06.811606 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.811487 2582 apiserver.go:42] "Waiting for node sync before watching
apiserver pods" Apr 16 13:59:06.812773 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.812747 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:06.812773 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.812776 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:06.816106 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.816087 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:59:06.817750 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.817736 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:06.819390 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819372 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819397 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819404 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819411 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819416 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819422 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819428 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819435 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819441 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819447 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819468 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:06.819476 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.819478 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:06.820771 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.820760 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:06.820771 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.820771 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:59:06.824537 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.824521 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:06.824597 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.824563 2582 server.go:1295] "Started kubelet" Apr 16 13:59:06.824671 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.824636 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:06.824767 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.824723 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:06.824822 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.824797 2582 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:06.825324 ip-10-0-128-29 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 13:59:06.828450 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.828260 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-29.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 13:59:06.828450 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.828376 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 13:59:06.828450 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.828384 2582 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-29.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 13:59:06.828671 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.828589 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 13:59:06.830319 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.830294 2582 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 13:59:06.834446 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.834425 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:06.834940 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.834923 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 13:59:06.834940 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.834056 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-29.ec2.internal.18a6db0bcff88a3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-29.ec2.internal,UID:ip-10-0-128-29.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-29.ec2.internal,},FirstTimestamp:2026-04-16 13:59:06.824534587 +0000 UTC m=+0.497435871,LastTimestamp:2026-04-16 13:59:06.824534587 +0000 UTC m=+0.497435871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-29.ec2.internal,}"
Apr 16 13:59:06.835624 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835605 2582 factory.go:55] Registering systemd factory
Apr 16 13:59:06.835734 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835628 2582 factory.go:223] Registration of the systemd container factory successfully
Apr 16 13:59:06.835734 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835606 2582 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 13:59:06.835734 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835664 2582 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 13:59:06.835734 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835607 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 13:59:06.835734 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835733 2582 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 13:59:06.835734 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835740 2582 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 13:59:06.836031 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835846 2582 factory.go:153] Registering CRI-O factory
Apr 16 13:59:06.836031 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835859 2582 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:59:06.836031 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.835855 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:06.836031 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835934 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 13:59:06.836031 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835959 2582 factory.go:103] Registering Raw factory
Apr 16 13:59:06.836031 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.835974 2582 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:59:06.836286 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.836276 2582 manager.go:319] Starting recovery of all containers
Apr 16 13:59:06.839011 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.838976 2582 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 13:59:06.840983 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.840960 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-97xxw"
Apr 16 13:59:06.842450 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.842251 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 13:59:06.842450 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.842309 2582 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-29.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 13:59:06.847448 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.847430 2582 manager.go:324] Recovery completed
Apr 16 13:59:06.847582 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.847553 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-97xxw"
Apr 16 13:59:06.848976 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.848954 2582 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 13:59:06.851951 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.851932 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:06.854595 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.854573 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:06.854667 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.854608 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:06.854667 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.854617 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:06.855162 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.855146 2582 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:59:06.855224 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.855166 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:59:06.855224 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.855182 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:06.857289 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.857197 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-29.ec2.internal.18a6db0bd1c33947 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-29.ec2.internal,UID:ip-10-0-128-29.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-29.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-29.ec2.internal,},FirstTimestamp:2026-04-16 13:59:06.854594887 +0000 UTC m=+0.527496172,LastTimestamp:2026-04-16 13:59:06.854594887 +0000 UTC m=+0.527496172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-29.ec2.internal,}"
Apr 16 13:59:06.857425 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.857412 2582 policy_none.go:49] "None policy: Start"
Apr 16 13:59:06.857458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.857446 2582 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:59:06.857458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.857457 2582 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:59:06.893289 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.893274 2582 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.893304 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.893313 2582 server.go:85] "Starting device plugin registration server"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.893543 2582 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.893556 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.893642 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.893735 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.893743 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.894171 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:59:06.906582 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.894198 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:06.933296 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.933273 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:59:06.934412 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.934398 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:59:06.934500 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.934424 2582 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:59:06.934500 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.934444 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:59:06.934500 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.934454 2582 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:59:06.934500 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:06.934491 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:59:06.938111 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.938093 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:06.994061 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.994015 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:06.995280 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.995265 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:06.995347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.995293 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:06.995347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.995304 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:06.995347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:06.995333 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.007419 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.007405 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.007461 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.007426 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-29.ec2.internal\": node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.035244 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.035224 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal"]
Apr 16 13:59:07.035309 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.035281 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:07.035989 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.035976 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:07.036053 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.036002 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:07.036053 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.036013 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:07.037450 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.037437 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:07.037578 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.037562 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.037618 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.037594 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:07.038070 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.038057 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:07.038070 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.038064 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:07.038163 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.038083 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:07.038163 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.038085 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:07.038163 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.038117 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:07.038163 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.038097 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:07.039887 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.039872 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.039940 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.039898 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:07.040564 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.040550 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:07.040650 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.040574 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:07.040650 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.040587 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:07.049630 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.049613 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.074300 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.074281 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-29.ec2.internal\" not found" node="ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.078650 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.078635 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-29.ec2.internal\" not found" node="ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.136425 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.136409 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb8ae9f57d7d43deceabe76dd74043d6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal\" (UID: \"eb8ae9f57d7d43deceabe76dd74043d6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.136505 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.136434 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb8ae9f57d7d43deceabe76dd74043d6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal\" (UID: \"eb8ae9f57d7d43deceabe76dd74043d6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.150521 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.150502 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.236841 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.236810 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/80eaaec527b04d922efdac38ef2d0c20-config\") pod \"kube-apiserver-proxy-ip-10-0-128-29.ec2.internal\" (UID: \"80eaaec527b04d922efdac38ef2d0c20\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.236940 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.236870 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb8ae9f57d7d43deceabe76dd74043d6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal\" (UID: \"eb8ae9f57d7d43deceabe76dd74043d6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.236940 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.236897 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb8ae9f57d7d43deceabe76dd74043d6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal\" (UID: \"eb8ae9f57d7d43deceabe76dd74043d6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.236940 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.236933 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb8ae9f57d7d43deceabe76dd74043d6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal\" (UID: \"eb8ae9f57d7d43deceabe76dd74043d6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.237046 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.236978 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb8ae9f57d7d43deceabe76dd74043d6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal\" (UID: \"eb8ae9f57d7d43deceabe76dd74043d6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.250996 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.250933 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.337589 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.337549 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/80eaaec527b04d922efdac38ef2d0c20-config\") pod \"kube-apiserver-proxy-ip-10-0-128-29.ec2.internal\" (UID: \"80eaaec527b04d922efdac38ef2d0c20\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.337752 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.337624 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/80eaaec527b04d922efdac38ef2d0c20-config\") pod \"kube-apiserver-proxy-ip-10-0-128-29.ec2.internal\" (UID: \"80eaaec527b04d922efdac38ef2d0c20\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.351671 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.351647 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.378827 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.378808 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.381315 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.381300 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.452767 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.452739 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.553225 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.553176 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.653776 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.653729 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.732008 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.731965 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:07.732553 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.732149 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:07.754367 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.754350 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.835088 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.835039 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:07.844250 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.844227 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:07.850357 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.850328 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:06 +0000 UTC" deadline="2027-11-03 23:39:44.784822943 +0000 UTC"
Apr 16 13:59:07.850436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.850357 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13593h40m36.934469191s"
Apr 16 13:59:07.854425 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:07.854406 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-29.ec2.internal\" not found"
Apr 16 13:59:07.858223 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.858210 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:07.919534 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.919515 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-przbn"
Apr 16 13:59:07.929072 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.929055 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-przbn"
Apr 16 13:59:07.935945 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.935927 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.952008 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.951990 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:07.953967 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.953954 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal"
Apr 16 13:59:07.961518 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:07.961503 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:08.036264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.036244 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:08.291442 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.291421 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:08.333988 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:08.333948 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80eaaec527b04d922efdac38ef2d0c20.slice/crio-13766a32307c5140b28d1641ad5fd70e32465962e38fe688ecefb314f4da6bc8 WatchSource:0}: Error finding container 13766a32307c5140b28d1641ad5fd70e32465962e38fe688ecefb314f4da6bc8: Status 404 returned error can't find the container with id 13766a32307c5140b28d1641ad5fd70e32465962e38fe688ecefb314f4da6bc8
Apr 16 13:59:08.337942 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.337926 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:08.583567 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:08.583543 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb8ae9f57d7d43deceabe76dd74043d6.slice/crio-95c23356ed90343c7f4c83ea432749aab901138569b00287b5d856fe5d1fa1d0 WatchSource:0}: Error finding container 95c23356ed90343c7f4c83ea432749aab901138569b00287b5d856fe5d1fa1d0: Status 404 returned error can't find the container with id 95c23356ed90343c7f4c83ea432749aab901138569b00287b5d856fe5d1fa1d0
Apr 16 13:59:08.812400 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.812367 2582 apiserver.go:52] "Watching apiserver"
Apr 16 13:59:08.816936 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.816916 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:59:08.817281 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.817261 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pq9ch","kube-system/global-pull-secret-syncer-hxjfz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg","openshift-image-registry/node-ca-sqfgf","openshift-multus/multus-additional-cni-plugins-pjcmk","openshift-multus/network-metrics-daemon-rt77p","openshift-network-diagnostics/network-check-target-qpb5w","openshift-network-operator/iptables-alerter-k697d","openshift-ovn-kubernetes/ovnkube-node-np5bh","kube-system/konnectivity-agent-flg9b","kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal","openshift-cluster-node-tuning-operator/tuned-ff2qp","openshift-dns/node-resolver-ztqms","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal"]
Apr 16 13:59:08.820613
ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.820590 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.821666 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.821645 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:08.821778 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.821746 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:08.822518 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.822500 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:08.822621 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.822548 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5kxlr\"" Apr 16 13:59:08.822621 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.822569 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:08.822621 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.822501 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:08.822621 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.822505 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:59:08.822876 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.822705 2582 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.822876 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.822807 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.823933 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.823908 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.824431 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824396 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:08.824843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824572 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lrkm6\"" Apr 16 13:59:08.824843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824614 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:08.824843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824641 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:08.824843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824664 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:08.824843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824697 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:08.824843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824760 2582 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5rgm5\"" Apr 16 13:59:08.824843 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.824764 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:08.825325 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.825268 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:08.825376 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.825320 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:08.825427 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.825397 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:59:08.825427 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.825399 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:08.825630 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.825615 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m2ddc\"" Apr 16 13:59:08.826347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.826329 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:08.826419 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.826401 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:08.827536 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.827519 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:08.828883 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.828865 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.829098 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.829076 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:08.829516 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.829501 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:08.829594 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.829543 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-htdc2\"" Apr 16 13:59:08.829594 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.829566 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:08.829987 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.829972 2582 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:08.831272 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.830601 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:08.831927 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.831509 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cg5wp\"" Apr 16 13:59:08.831927 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.831521 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:08.831927 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.831745 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:08.832352 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.832332 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:08.832531 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.832515 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:08.832626 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.832542 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:08.832720 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.832704 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lmmnh\"" Apr 16 13:59:08.832892 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.832795 2582 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:08.832892 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.832573 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:08.833667 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.833645 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:08.833782 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.833674 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.835382 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.835365 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:08.835643 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.835627 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:08.835740 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.835657 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:08.835740 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.835631 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mph7k\"" Apr 16 13:59:08.835740 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.835656 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:08.835897 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.835627 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cqpdx\"" Apr 16 
13:59:08.836951 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.836935 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:08.844805 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844788 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-os-release\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.844887 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844810 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg47x\" (UniqueName: \"kubernetes.io/projected/42370dc2-2d36-49c1-b178-6763d784a3e0-kube-api-access-kg47x\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:08.844887 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844830 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-sys\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.844887 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844844 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-lib-modules\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.844887 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844859 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-tuned\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.844887 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844879 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-tmp\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844900 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-os-release\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844917 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-cni-multus\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844941 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9hhs\" (UniqueName: \"kubernetes.io/projected/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-kube-api-access-b9hhs\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.845071 
ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844955 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.844988 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-device-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845012 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/42370dc2-2d36-49c1-b178-6763d784a3e0-host-slash\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845028 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-env-overrides\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845044 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97gnc\" (UniqueName: \"kubernetes.io/projected/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-kube-api-access-97gnc\") pod 
\"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:08.845071 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845064 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-kubelet\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845085 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-conf-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845107 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysctl-d\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845150 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-kubelet\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845202 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-ovn\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845229 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-log-socket\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845254 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovnkube-config\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845278 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-cnibin\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845302 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cni-binary-copy\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845329 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n67l\" (UniqueName: \"kubernetes.io/projected/86d416f7-1028-4d19-9a65-2ecc6960eeb7-kube-api-access-5n67l\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845352 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845377 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/42370dc2-2d36-49c1-b178-6763d784a3e0-iptables-alerter-script\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845400 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-systemd-units\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845458 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845426 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-node-log\") pod \"ovnkube-node-np5bh\" (UID: 
\"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845469 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-cni-bin\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845488 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845502 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-cni-netd\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845515 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-cni-bin\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845534 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-hostroot\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845562 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-host\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845579 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-serviceca\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845593 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-sys-fs\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845613 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkrxx\" (UniqueName: \"kubernetes.io/projected/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-kube-api-access-wkrxx\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845633 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/083b5510-48e0-4313-a2e7-fca5271e9e0f-agent-certs\") pod \"konnectivity-agent-flg9b\" (UID: \"083b5510-48e0-4313-a2e7-fca5271e9e0f\") " pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845647 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysctl-conf\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845665 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-hosts-file\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845695 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgds\" (UniqueName: \"kubernetes.io/projected/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-kube-api-access-xlgds\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845720 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-registration-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 
13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845740 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysconfig\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845753 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-etc-kubernetes\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.845949 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845777 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/80f4a0f5-8232-4155-a115-e7470360cc63-kubelet-config\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845802 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gm4\" (UniqueName: \"kubernetes.io/projected/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-kube-api-access-c7gm4\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845816 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-socket-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" 
(UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845838 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-var-lib-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845860 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845877 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-k8s-cni-cncf-io\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845895 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-system-cni-dir\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845911 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-systemd\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845933 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-system-cni-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845961 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-socket-dir-parent\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.845980 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-daemon-config\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846003 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-run-netns\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.846611 ip-10-0-128-29 
kubenswrapper[2582]: I0416 13:59:08.846026 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846042 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-run\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846058 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-systemd\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846071 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-etc-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.846611 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846110 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-cni-dir\") pod \"multus-pq9ch\" (UID: 
\"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846142 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-cni-binary-copy\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846166 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cnibin\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846190 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovn-node-metrics-cert\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846213 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovnkube-script-lib\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846235 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-modprobe-d\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846258 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-var-lib-kubelet\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846288 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-tmp-dir\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846311 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-netns\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846330 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846352 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846367 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-slash\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846400 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-run-ovn-kubernetes\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846421 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-kubernetes\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846451 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-multus-certs\") pod \"multus-pq9ch\" (UID: 
\"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846466 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.847229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846479 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-etc-selinux\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.847710 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846493 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.847710 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846508 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/083b5510-48e0-4313-a2e7-fca5271e9e0f-konnectivity-ca\") pod \"konnectivity-agent-flg9b\" (UID: \"083b5510-48e0-4313-a2e7-fca5271e9e0f\") " pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:08.847710 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846546 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-9tcjl\" (UniqueName: \"kubernetes.io/projected/03bde64a-28ea-4a04-a03d-543688a5f10e-kube-api-access-9tcjl\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.847710 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846567 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-host\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.847710 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846590 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hkc5\" (UniqueName: \"kubernetes.io/projected/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-kube-api-access-8hkc5\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.847710 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.846632 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/80f4a0f5-8232-4155-a115-e7470360cc63-dbus\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:08.931766 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.931731 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:07 +0000 UTC" deadline="2027-09-13 06:06:03.027146323 +0000 UTC" Apr 16 13:59:08.931766 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.931763 2582 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12352h6m54.095387094s" Apr 16 13:59:08.938121 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.938083 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal" event={"ID":"eb8ae9f57d7d43deceabe76dd74043d6","Type":"ContainerStarted","Data":"95c23356ed90343c7f4c83ea432749aab901138569b00287b5d856fe5d1fa1d0"} Apr 16 13:59:08.941396 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.941368 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal" event={"ID":"80eaaec527b04d922efdac38ef2d0c20","Type":"ContainerStarted","Data":"13766a32307c5140b28d1641ad5fd70e32465962e38fe688ecefb314f4da6bc8"} Apr 16 13:59:08.947738 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947720 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.947806 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947746 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-cni-netd\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.947806 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947762 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-cni-bin\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " 
pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.947806 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947776 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-hostroot\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.947806 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947791 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-host\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947808 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-serviceca\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947831 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-sys-fs\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947835 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-cni-netd\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.947998 
ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947855 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkrxx\" (UniqueName: \"kubernetes.io/projected/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-kube-api-access-wkrxx\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947864 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-cni-bin\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947881 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/083b5510-48e0-4313-a2e7-fca5271e9e0f-agent-certs\") pod \"konnectivity-agent-flg9b\" (UID: \"083b5510-48e0-4313-a2e7-fca5271e9e0f\") " pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947906 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysctl-conf\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947908 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-host\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: 
I0416 13:59:08.947854 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-hostroot\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.947998 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.947928 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-hosts-file\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948024 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgds\" (UniqueName: \"kubernetes.io/projected/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-kube-api-access-xlgds\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948039 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-hosts-file\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948089 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-sys-fs\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948141 2582 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysctl-conf\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948189 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-registration-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948215 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysconfig\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948235 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-etc-kubernetes\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948258 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/80f4a0f5-8232-4155-a115-e7470360cc63-kubelet-config\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948258 2582 swap_util.go:74] 
"error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948286 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7gm4\" (UniqueName: \"kubernetes.io/projected/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-kube-api-access-c7gm4\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948295 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-etc-kubernetes\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948311 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-socket-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948298 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-registration-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948325 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/80f4a0f5-8232-4155-a115-e7470360cc63-kubelet-config\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948336 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-var-lib-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948288 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysconfig\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.948436 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948363 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948384 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-var-lib-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948389 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-k8s-cni-cncf-io\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948416 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-system-cni-dir\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948427 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948418 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948441 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-systemd\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948413 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-socket-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948455 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-k8s-cni-cncf-io\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948465 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-system-cni-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948472 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-system-cni-dir\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948489 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-socket-dir-parent\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948518 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-daemon-config\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948526 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-system-cni-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948499 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-systemd\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948543 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-run-netns\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948560 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-socket-dir-parent\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949055 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948577 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948594 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-run-netns\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948601 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-run\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948639 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-run\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948647 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-systemd\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948674 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-etc-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948696 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-systemd\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948711 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-cni-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.948720 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948733 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-etc-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948734 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-cni-binary-copy\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948772 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cnibin\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948789 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-cni-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.948796 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret podName:80f4a0f5-8232-4155-a115-e7470360cc63 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:09.448765698 +0000 UTC m=+3.121666981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret") pod "global-pull-secret-syncer-hxjfz" (UID: "80f4a0f5-8232-4155-a115-e7470360cc63") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948804 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cnibin\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948824 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovn-node-metrics-cert\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948845 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-serviceca\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf"
Apr 16 13:59:08.949826 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948851 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovnkube-script-lib\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948904 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-modprobe-d\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948926 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-var-lib-kubelet\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948947 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-tmp-dir\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948966 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-netns\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.948986 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949010 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949031 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-slash\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949047 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-modprobe-d\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949049 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-run-ovn-kubernetes\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949078 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-run-ovn-kubernetes\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949090 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-kubernetes\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949117 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-var-lib-kubelet\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949117 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-multus-certs\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949148 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-multus-certs\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949151 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949155 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-daemon-config\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.950575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949156 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-kubernetes\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949180 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-etc-selinux\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949204 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949243 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-openvswitch\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949275 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-cni-binary-copy\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949306 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-etc-selinux\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949338 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/083b5510-48e0-4313-a2e7-fca5271e9e0f-konnectivity-ca\") pod \"konnectivity-agent-flg9b\" (UID: \"083b5510-48e0-4313-a2e7-fca5271e9e0f\") " pod="kube-system/konnectivity-agent-flg9b"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949417 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tcjl\" (UniqueName: \"kubernetes.io/projected/03bde64a-28ea-4a04-a03d-543688a5f10e-kube-api-access-9tcjl\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949444 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-host\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hkc5\" (UniqueName: \"kubernetes.io/projected/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-kube-api-access-8hkc5\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949489 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/80f4a0f5-8232-4155-a115-e7470360cc63-dbus\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949513 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-host\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949513 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-os-release\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949551 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kg47x\" (UniqueName: \"kubernetes.io/projected/42370dc2-2d36-49c1-b178-6763d784a3e0-kube-api-access-kg47x\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949568 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949574 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-sys\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949610 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-run-netns\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.951307 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949621 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-sys\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949638 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-lib-modules\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949760 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-lib-modules\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949467 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-tmp-dir\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949866 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovnkube-script-lib\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949946 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/80f4a0f5-8232-4155-a115-e7470360cc63-dbus\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949947 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/083b5510-48e0-4313-a2e7-fca5271e9e0f-konnectivity-ca\") pod \"konnectivity-agent-flg9b\" (UID: \"083b5510-48e0-4313-a2e7-fca5271e9e0f\") " pod="kube-system/konnectivity-agent-flg9b"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949963 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.949976 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-tuned\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950019 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-os-release\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950035 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-tmp\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950058 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-slash\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950061 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-os-release\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950086 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-cni-multus\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950124 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-os-release\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950118 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9hhs\" (UniqueName: \"kubernetes.io/projected/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-kube-api-access-b9hhs\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950161 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950171 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-cni-multus\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.951904 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950184 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-device-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950207 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/42370dc2-2d36-49c1-b178-6763d784a3e0-host-slash\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950247 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-env-overrides\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950296 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03bde64a-28ea-4a04-a03d-543688a5f10e-device-dir\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.950410 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950417 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950421 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/42370dc2-2d36-49c1-b178-6763d784a3e0-host-slash\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.950456 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:09.450441185 +0000 UTC m=+3.123342470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950503 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97gnc\" (UniqueName: \"kubernetes.io/projected/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-kube-api-access-97gnc\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950569 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-kubelet\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950597 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-conf-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950621 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysctl-d\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp"
Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950662 2582 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-host-var-lib-kubelet\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950663 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-kubelet\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950711 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-env-overrides\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950767 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-ovn\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950728 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-multus-conf-dir\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.952411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950725 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-run-ovn\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950809 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-kubelet\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950821 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-log-socket\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950833 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-sysctl-d\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950849 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovnkube-config\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950857 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-log-socket\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950875 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-cnibin\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950903 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cni-binary-copy\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950929 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-cnibin\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950932 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5n67l\" (UniqueName: \"kubernetes.io/projected/86d416f7-1028-4d19-9a65-2ecc6960eeb7-kube-api-access-5n67l\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.950979 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: 
\"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951006 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/42370dc2-2d36-49c1-b178-6763d784a3e0-iptables-alerter-script\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951031 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-systemd-units\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951073 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-systemd-units\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951266 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-node-log\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951298 2582 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-cni-bin\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951363 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-host-cni-bin\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951368 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovnkube-config\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951408 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-node-log\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953658 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951584 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-cni-binary-copy\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.953658 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951625 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/42370dc2-2d36-49c1-b178-6763d784a3e0-iptables-alerter-script\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:08.953658 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951898 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-ovn-node-metrics-cert\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.953658 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.951995 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/083b5510-48e0-4313-a2e7-fca5271e9e0f-agent-certs\") pod \"konnectivity-agent-flg9b\" (UID: \"083b5510-48e0-4313-a2e7-fca5271e9e0f\") " pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:08.953658 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.952261 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-tmp\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.953658 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.953033 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-etc-tuned\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.956123 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.956098 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7gm4\" 
(UniqueName: \"kubernetes.io/projected/56fe5eb2-ae67-4d8a-a719-f51bf68da0d0-kube-api-access-c7gm4\") pod \"node-ca-sqfgf\" (UID: \"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0\") " pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:08.956499 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.956484 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgds\" (UniqueName: \"kubernetes.io/projected/2b62492e-79c4-4431-a7af-4bcaa0f1c8aa-kube-api-access-xlgds\") pod \"multus-pq9ch\" (UID: \"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa\") " pod="openshift-multus/multus-pq9ch" Apr 16 13:59:08.956612 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.956597 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkrxx\" (UniqueName: \"kubernetes.io/projected/ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9-kube-api-access-wkrxx\") pod \"ovnkube-node-np5bh\" (UID: \"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:08.959210 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.959185 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tcjl\" (UniqueName: \"kubernetes.io/projected/03bde64a-28ea-4a04-a03d-543688a5f10e-kube-api-access-9tcjl\") pod \"aws-ebs-csi-driver-node-xv6kg\" (UID: \"03bde64a-28ea-4a04-a03d-543688a5f10e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:08.962096 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.962072 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hkc5\" (UniqueName: \"kubernetes.io/projected/2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df-kube-api-access-8hkc5\") pod \"tuned-ff2qp\" (UID: \"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df\") " pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:08.962303 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.962287 2582 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-b9hhs\" (UniqueName: \"kubernetes.io/projected/dfd38cb4-73f0-4cb1-a3ee-4f877e37742f-kube-api-access-b9hhs\") pod \"multus-additional-cni-plugins-pjcmk\" (UID: \"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f\") " pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:08.962501 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.962485 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg47x\" (UniqueName: \"kubernetes.io/projected/42370dc2-2d36-49c1-b178-6763d784a3e0-kube-api-access-kg47x\") pod \"iptables-alerter-k697d\" (UID: \"42370dc2-2d36-49c1-b178-6763d784a3e0\") " pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:08.963600 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.963580 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97gnc\" (UniqueName: \"kubernetes.io/projected/1c601f7b-2758-4b61-a47e-bdc41ba6fb31-kube-api-access-97gnc\") pod \"node-resolver-ztqms\" (UID: \"1c601f7b-2758-4b61-a47e-bdc41ba6fb31\") " pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:08.963844 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.963829 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:08.963906 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.963846 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:08.963906 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.963855 2582 projected.go:194] Error preparing data for projected volume kube-api-access-wqxst for pod openshift-network-diagnostics/network-check-target-qpb5w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 16 13:59:08.964006 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:08.963917 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst podName:a0c4f1c8-43b6-4596-a619-0dd4cba798af nodeName:}" failed. No retries permitted until 2026-04-16 13:59:09.463903155 +0000 UTC m=+3.136804440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wqxst" (UniqueName: "kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst") pod "network-check-target-qpb5w" (UID: "a0c4f1c8-43b6-4596-a619-0dd4cba798af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:08.966596 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:08.966577 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n67l\" (UniqueName: \"kubernetes.io/projected/86d416f7-1028-4d19-9a65-2ecc6960eeb7-kube-api-access-5n67l\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:09.064474 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.064452 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:09.130506 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.130447 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pq9ch" Apr 16 13:59:09.136290 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.136265 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b62492e_79c4_4431_a7af_4bcaa0f1c8aa.slice/crio-2445a11c8479af8c32aace0dd286be1e56eb1f81e06132fe3861531ae6262d68 WatchSource:0}: Error finding container 2445a11c8479af8c32aace0dd286be1e56eb1f81e06132fe3861531ae6262d68: Status 404 returned error can't find the container with id 2445a11c8479af8c32aace0dd286be1e56eb1f81e06132fe3861531ae6262d68 Apr 16 13:59:09.139180 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.139163 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" Apr 16 13:59:09.145261 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.145244 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sqfgf" Apr 16 13:59:09.145667 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.145560 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03bde64a_28ea_4a04_a03d_543688a5f10e.slice/crio-d79bddc768a4b131d4b426ce7309ae06c760f68a5a06460f03c23c6b567f9d22 WatchSource:0}: Error finding container d79bddc768a4b131d4b426ce7309ae06c760f68a5a06460f03c23c6b567f9d22: Status 404 returned error can't find the container with id d79bddc768a4b131d4b426ce7309ae06c760f68a5a06460f03c23c6b567f9d22 Apr 16 13:59:09.149984 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.149943 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" Apr 16 13:59:09.152039 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.151990 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56fe5eb2_ae67_4d8a_a719_f51bf68da0d0.slice/crio-76bd8382c1246764a21f8f3b6ec20260e9595574743fa6dd9a041248f9656e52 WatchSource:0}: Error finding container 76bd8382c1246764a21f8f3b6ec20260e9595574743fa6dd9a041248f9656e52: Status 404 returned error can't find the container with id 76bd8382c1246764a21f8f3b6ec20260e9595574743fa6dd9a041248f9656e52 Apr 16 13:59:09.155242 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.155195 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k697d" Apr 16 13:59:09.158046 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.158010 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd38cb4_73f0_4cb1_a3ee_4f877e37742f.slice/crio-71f39906b56a9a187968fa367703089af4f046ea79e9fb044823f16daa2c6039 WatchSource:0}: Error finding container 71f39906b56a9a187968fa367703089af4f046ea79e9fb044823f16daa2c6039: Status 404 returned error can't find the container with id 71f39906b56a9a187968fa367703089af4f046ea79e9fb044823f16daa2c6039 Apr 16 13:59:09.160174 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.160156 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:09.162296 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.162271 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42370dc2_2d36_49c1_b178_6763d784a3e0.slice/crio-d87ecddb1ac53a372f11b9dbe34098ced3dd515d897088c73fdc94f509f7cc3f WatchSource:0}: Error finding container d87ecddb1ac53a372f11b9dbe34098ced3dd515d897088c73fdc94f509f7cc3f: Status 404 returned error can't find the container with id d87ecddb1ac53a372f11b9dbe34098ced3dd515d897088c73fdc94f509f7cc3f Apr 16 13:59:09.165518 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.165496 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:09.168206 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.168184 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff8a1d02_93a9_4b2d_a7f6_95a4d86e0fe9.slice/crio-f3a26a0ae1fb81955dbb8df934b93c2b4e62a21c79569fab289cb0a690591e26 WatchSource:0}: Error finding container f3a26a0ae1fb81955dbb8df934b93c2b4e62a21c79569fab289cb0a690591e26: Status 404 returned error can't find the container with id f3a26a0ae1fb81955dbb8df934b93c2b4e62a21c79569fab289cb0a690591e26 Apr 16 13:59:09.170594 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.170527 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ztqms" Apr 16 13:59:09.173374 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.173261 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod083b5510_48e0_4313_a2e7_fca5271e9e0f.slice/crio-912684586abdcb572026eeabf8793fcff917f831c00bec3c76c308c9c4ea3f12 WatchSource:0}: Error finding container 912684586abdcb572026eeabf8793fcff917f831c00bec3c76c308c9c4ea3f12: Status 404 returned error can't find the container with id 912684586abdcb572026eeabf8793fcff917f831c00bec3c76c308c9c4ea3f12 Apr 16 13:59:09.175396 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.175376 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" Apr 16 13:59:09.181651 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.181629 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c601f7b_2758_4b61_a47e_bdc41ba6fb31.slice/crio-b5c687d04e9ca90d9e2c2a62b21740a6b2d5ec2e5e9e73ea777cfc8c1d90f163 WatchSource:0}: Error finding container b5c687d04e9ca90d9e2c2a62b21740a6b2d5ec2e5e9e73ea777cfc8c1d90f163: Status 404 returned error can't find the container with id b5c687d04e9ca90d9e2c2a62b21740a6b2d5ec2e5e9e73ea777cfc8c1d90f163 Apr 16 13:59:09.186489 ip-10-0-128-29 kubenswrapper[2582]: W0416 13:59:09.186469 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea59d91_2e6f_4f2d_b4dd_88ecb1bf00df.slice/crio-2d31689696c669b68e10b4e2c600830f942cd5807a64b2740b68db53205077b0 WatchSource:0}: Error finding container 2d31689696c669b68e10b4e2c600830f942cd5807a64b2740b68db53205077b0: Status 404 returned error can't find the container with id 2d31689696c669b68e10b4e2c600830f942cd5807a64b2740b68db53205077b0 Apr 16 13:59:09.453584 ip-10-0-128-29 
kubenswrapper[2582]: I0416 13:59:09.453448 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:09.453752 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.453630 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:09.453752 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.453716 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:10.453693216 +0000 UTC m=+4.126594513 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:09.453752 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.453745 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:09.454012 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.453888 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:09.454012 ip-10-0-128-29 kubenswrapper[2582]: E0416 
13:59:09.453933 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret podName:80f4a0f5-8232-4155-a115-e7470360cc63 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:10.453920414 +0000 UTC m=+4.126821689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret") pod "global-pull-secret-syncer-hxjfz" (UID: "80f4a0f5-8232-4155-a115-e7470360cc63") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:09.555090 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.555030 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:09.555253 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.555225 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:09.555314 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.555252 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:09.555314 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.555268 2582 projected.go:194] Error preparing data for projected volume kube-api-access-wqxst for pod openshift-network-diagnostics/network-check-target-qpb5w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 
13:59:09.555414 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.555336 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst podName:a0c4f1c8-43b6-4596-a619-0dd4cba798af nodeName:}" failed. No retries permitted until 2026-04-16 13:59:10.555316513 +0000 UTC m=+4.228217810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wqxst" (UniqueName: "kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst") pod "network-check-target-qpb5w" (UID: "a0c4f1c8-43b6-4596-a619-0dd4cba798af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:09.932489 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.932442 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:07 +0000 UTC" deadline="2027-11-15 18:41:12.141610412 +0000 UTC" Apr 16 13:59:09.932489 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.932485 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13876h42m2.209128982s" Apr 16 13:59:09.936129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.935646 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:09.936129 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:09.935775 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:09.944372 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.944339 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerStarted","Data":"71f39906b56a9a187968fa367703089af4f046ea79e9fb044823f16daa2c6039"} Apr 16 13:59:09.945576 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.945537 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqfgf" event={"ID":"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0","Type":"ContainerStarted","Data":"76bd8382c1246764a21f8f3b6ec20260e9595574743fa6dd9a041248f9656e52"} Apr 16 13:59:09.946633 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.946596 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" event={"ID":"03bde64a-28ea-4a04-a03d-543688a5f10e","Type":"ContainerStarted","Data":"d79bddc768a4b131d4b426ce7309ae06c760f68a5a06460f03c23c6b567f9d22"} Apr 16 13:59:09.947799 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.947757 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq9ch" event={"ID":"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa","Type":"ContainerStarted","Data":"2445a11c8479af8c32aace0dd286be1e56eb1f81e06132fe3861531ae6262d68"} Apr 16 13:59:09.948821 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.948782 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" event={"ID":"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df","Type":"ContainerStarted","Data":"2d31689696c669b68e10b4e2c600830f942cd5807a64b2740b68db53205077b0"} Apr 16 13:59:09.950338 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.950308 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztqms" 
event={"ID":"1c601f7b-2758-4b61-a47e-bdc41ba6fb31","Type":"ContainerStarted","Data":"b5c687d04e9ca90d9e2c2a62b21740a6b2d5ec2e5e9e73ea777cfc8c1d90f163"} Apr 16 13:59:09.951515 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.951476 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-flg9b" event={"ID":"083b5510-48e0-4313-a2e7-fca5271e9e0f","Type":"ContainerStarted","Data":"912684586abdcb572026eeabf8793fcff917f831c00bec3c76c308c9c4ea3f12"} Apr 16 13:59:09.952728 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.952706 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k697d" event={"ID":"42370dc2-2d36-49c1-b178-6763d784a3e0","Type":"ContainerStarted","Data":"d87ecddb1ac53a372f11b9dbe34098ced3dd515d897088c73fdc94f509f7cc3f"} Apr 16 13:59:09.954029 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:09.953998 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"f3a26a0ae1fb81955dbb8df934b93c2b4e62a21c79569fab289cb0a690591e26"} Apr 16 13:59:10.463936 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:10.463140 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:10.463936 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:10.463211 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" 
Apr 16 13:59:10.463936 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.463377 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:10.463936 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.463441 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:12.46342358 +0000 UTC m=+6.136324866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:10.463936 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.463843 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:10.463936 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.463896 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret podName:80f4a0f5-8232-4155-a115-e7470360cc63 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:12.46387925 +0000 UTC m=+6.136780534 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret") pod "global-pull-secret-syncer-hxjfz" (UID: "80f4a0f5-8232-4155-a115-e7470360cc63") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:10.564837 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:10.564225 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:10.564837 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.564382 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:10.564837 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.564399 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:10.564837 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.564411 2582 projected.go:194] Error preparing data for projected volume kube-api-access-wqxst for pod openshift-network-diagnostics/network-check-target-qpb5w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:10.564837 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.564469 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst podName:a0c4f1c8-43b6-4596-a619-0dd4cba798af nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:12.564451507 +0000 UTC m=+6.237352801 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wqxst" (UniqueName: "kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst") pod "network-check-target-qpb5w" (UID: "a0c4f1c8-43b6-4596-a619-0dd4cba798af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:10.938163 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:10.937377 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:10.938163 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.937503 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:10.938163 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:10.937965 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:10.938163 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:10.938066 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:11.936127 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:11.935633 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:11.936127 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:11.935780 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:12.480286 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:12.480251 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:12.480757 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:12.480346 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:12.480757 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.480390 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:12.480757 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.480466 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:16.480445623 +0000 UTC m=+10.153346906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:12.480757 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.480493 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:12.480757 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.480544 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret podName:80f4a0f5-8232-4155-a115-e7470360cc63 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:16.480529293 +0000 UTC m=+10.153430565 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret") pod "global-pull-secret-syncer-hxjfz" (UID: "80f4a0f5-8232-4155-a115-e7470360cc63") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:12.580698 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:12.580650 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:12.580873 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.580841 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:12.580873 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.580860 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:12.580873 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.580871 2582 projected.go:194] Error preparing data for projected volume kube-api-access-wqxst for pod openshift-network-diagnostics/network-check-target-qpb5w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:12.581058 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.580930 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst podName:a0c4f1c8-43b6-4596-a619-0dd4cba798af nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:16.580910378 +0000 UTC m=+10.253811670 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wqxst" (UniqueName: "kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst") pod "network-check-target-qpb5w" (UID: "a0c4f1c8-43b6-4596-a619-0dd4cba798af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:12.936315 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:12.935699 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:12.936315 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.935844 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:12.936529 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:12.936395 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:12.936529 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:12.936494 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:13.935098 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:13.934644 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:13.935098 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:13.934851 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:14.935165 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:14.935138 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:14.935565 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:14.935182 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:14.935565 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:14.935267 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:14.935565 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:14.935368 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:15.934960 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:15.934889 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:15.935154 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:15.935018 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:16.512984 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:16.512946 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:16.513459 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:16.513020 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:16.513459 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.513108 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:16.513459 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.513171 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.513153034 +0000 UTC m=+18.186054312 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:16.513459 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.513108 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:16.513459 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.513222 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret podName:80f4a0f5-8232-4155-a115-e7470360cc63 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.513211652 +0000 UTC m=+18.186112937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret") pod "global-pull-secret-syncer-hxjfz" (UID: "80f4a0f5-8232-4155-a115-e7470360cc63") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:16.613798 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:16.613752 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:16.613954 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.613936 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:16.614015 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.613960 2582 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:16.614015 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.613973 2582 projected.go:194] Error preparing data for projected volume kube-api-access-wqxst for pod openshift-network-diagnostics/network-check-target-qpb5w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:16.614111 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.614035 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst podName:a0c4f1c8-43b6-4596-a619-0dd4cba798af nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.614016256 +0000 UTC m=+18.286917543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wqxst" (UniqueName: "kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst") pod "network-check-target-qpb5w" (UID: "a0c4f1c8-43b6-4596-a619-0dd4cba798af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:16.936999 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:16.936022 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:16.936999 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.936098 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:16.936999 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:16.936443 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:16.936999 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:16.936546 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:17.935241 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:17.935061 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:17.935241 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:17.935185 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:18.935238 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:18.935203 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:18.935238 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:18.935219 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:18.935784 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:18.935338 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:18.935784 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:18.935472 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:19.935594 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:19.935558 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:19.936005 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:19.935664 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:20.934794 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:20.934763 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:20.934975 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:20.934878 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:20.934975 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:20.934763 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:20.935092 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:20.934980 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:21.935246 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:21.935035 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:21.935647 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:21.935326 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:22.935263 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:22.935227 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:22.935657 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:22.935232 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:22.935657 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:22.935350 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:22.935657 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:22.935419 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:23.934811 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:23.934773 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:23.934964 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:23.934899 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:24.577739 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:24.577702 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:24.578180 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:24.577792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:24.578180 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.577880 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:24.578180 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.577888 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:24.578180 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.577960 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret podName:80f4a0f5-8232-4155-a115-e7470360cc63 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.577941402 +0000 UTC m=+34.250842689 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret") pod "global-pull-secret-syncer-hxjfz" (UID: "80f4a0f5-8232-4155-a115-e7470360cc63") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:24.578180 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.577979 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.577969996 +0000 UTC m=+34.250871270 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:24.678699 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:24.678652 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:24.678875 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.678814 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:24.678875 ip-10-0-128-29 
kubenswrapper[2582]: E0416 13:59:24.678838 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:24.678875 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.678850 2582 projected.go:194] Error preparing data for projected volume kube-api-access-wqxst for pod openshift-network-diagnostics/network-check-target-qpb5w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:24.679005 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.678904 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst podName:a0c4f1c8-43b6-4596-a619-0dd4cba798af nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.678886925 +0000 UTC m=+34.351788208 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wqxst" (UniqueName: "kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst") pod "network-check-target-qpb5w" (UID: "a0c4f1c8-43b6-4596-a619-0dd4cba798af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:24.934716 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:24.934623 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:24.934883 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.934769 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:24.934883 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:24.934843 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:24.934988 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:24.934943 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:25.935099 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:25.935069 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:25.935495 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:25.935177 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:26.936232 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.936082 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:26.936584 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.936148 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:26.936584 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:26.936315 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:26.936584 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:26.936372 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:26.985739 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.985717 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 13:59:26.986087 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.986064 2582 generic.go:358] "Generic (PLEG): container finished" podID="ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9" containerID="489d4a876884d4e6e4f739f14f868d4e5ef7276b6101456796e84b1f3bdc7ab1" exitCode=1 Apr 16 13:59:26.986179 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.986123 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"4ff7769ae9119b3a6aca2426e31e5456afa8f3364ed23bcc99a59743d1796e0d"} Apr 16 13:59:26.986179 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.986159 2582 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"51eadb5702377408f523facd5cbc3657a7ab4b8505d68825cc07832f7836b78c"} Apr 16 13:59:26.986179 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.986173 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"c7b64ce234bacbfb16c0fbf877b265734667bad9f5c41fb25773c7f2afb84445"} Apr 16 13:59:26.986294 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.986186 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerDied","Data":"489d4a876884d4e6e4f739f14f868d4e5ef7276b6101456796e84b1f3bdc7ab1"} Apr 16 13:59:26.986294 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.986201 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"658a2b870e96f3db38178c1246b6a29ef6caf3f8ada97fd8e131d3cc93afe51b"} Apr 16 13:59:26.987594 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.987569 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pq9ch" event={"ID":"2b62492e-79c4-4431-a7af-4bcaa0f1c8aa","Type":"ContainerStarted","Data":"6a8a738e554528fc834e1b5be5825a92c54bbc1f53a5fa12b175c98795d275bb"} Apr 16 13:59:26.989038 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:26.989018 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal" event={"ID":"80eaaec527b04d922efdac38ef2d0c20","Type":"ContainerStarted","Data":"240415a6c80981a4ccf5f691813abd5113931f881184e3ae51d49cc2a6b99b4c"} Apr 16 13:59:26.991251 ip-10-0-128-29 kubenswrapper[2582]: I0416 
13:59:26.991229 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" event={"ID":"2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df","Type":"ContainerStarted","Data":"24196953c4f8671a1e50986394e7b7ad52f2c4a844db3db51c2b40d7ba04386c"} Apr 16 13:59:27.009337 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:27.009301 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pq9ch" podStartSLOduration=3.004872925 podStartE2EDuration="20.009291075s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.137773057 +0000 UTC m=+2.810674333" lastFinishedPulling="2026-04-16 13:59:26.142191209 +0000 UTC m=+19.815092483" observedRunningTime="2026-04-16 13:59:27.005195069 +0000 UTC m=+20.678096363" watchObservedRunningTime="2026-04-16 13:59:27.009291075 +0000 UTC m=+20.682192368" Apr 16 13:59:27.039841 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:27.039782 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-29.ec2.internal" podStartSLOduration=20.039763136 podStartE2EDuration="20.039763136s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:27.039040985 +0000 UTC m=+20.711942280" watchObservedRunningTime="2026-04-16 13:59:27.039763136 +0000 UTC m=+20.712664431" Apr 16 13:59:27.040016 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:27.039868 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ff2qp" podStartSLOduration=3.11962899 podStartE2EDuration="20.039860661s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.188608157 +0000 UTC m=+2.861509434" lastFinishedPulling="2026-04-16 13:59:26.108839818 +0000 UTC 
m=+19.781741105" observedRunningTime="2026-04-16 13:59:27.021956122 +0000 UTC m=+20.694857417" watchObservedRunningTime="2026-04-16 13:59:27.039860661 +0000 UTC m=+20.712761957" Apr 16 13:59:27.935430 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:27.935394 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:27.935586 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:27.935488 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:27.994067 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:27.994029 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k697d" event={"ID":"42370dc2-2d36-49c1-b178-6763d784a3e0","Type":"ContainerStarted","Data":"6753549c41c6bfa6166dd0923e022765128588bbb8a32f8284d1a8d52fcc8887"} Apr 16 13:59:27.996251 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:27.996230 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 13:59:27.996574 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:27.996553 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"d5da276c02d9b0581a48fb8ed5f39b16e3c161e8a557a8a84389fa906c1ac026"} Apr 16 13:59:28.009128 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:28.009090 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-k697d" podStartSLOduration=4.080482839 podStartE2EDuration="21.009078322s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.164571507 +0000 UTC m=+2.837472778" lastFinishedPulling="2026-04-16 13:59:26.093166975 +0000 UTC m=+19.766068261" observedRunningTime="2026-04-16 13:59:28.008893307 +0000 UTC m=+21.681794601" watchObservedRunningTime="2026-04-16 13:59:28.009078322 +0000 UTC m=+21.681979616" Apr 16 13:59:28.937575 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:28.937543 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:28.937772 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:28.937595 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:28.937772 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:28.937676 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:28.937864 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:28.937770 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:29.935728 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:29.935517 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:29.936251 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:29.935827 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:30.001151 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:30.001130 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 13:59:30.001403 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:30.001383 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"edb14ee1fe2840e0d3318aee20969a9795d71d0ed35c1aee6ee5deb281b3b5fe"} Apr 16 13:59:30.937943 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:30.937916 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:30.938337 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:30.937920 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:30.938337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:30.938022 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:30.938337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:30.938089 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:31.006117 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.006081 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" event={"ID":"03bde64a-28ea-4a04-a03d-543688a5f10e","Type":"ContainerStarted","Data":"e1f4e71c5a9d0b5a1932f69771e9be9bb088a704ba546079d952189dc6671276"} Apr 16 13:59:31.007501 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.007477 2582 generic.go:358] "Generic (PLEG): container finished" podID="eb8ae9f57d7d43deceabe76dd74043d6" containerID="3ca906bd0537d1738a2b40f373bc79729edb8d5846479c7f7dd5db7bee6d6acc" exitCode=0 Apr 16 13:59:31.007632 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.007536 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal" event={"ID":"eb8ae9f57d7d43deceabe76dd74043d6","Type":"ContainerDied","Data":"3ca906bd0537d1738a2b40f373bc79729edb8d5846479c7f7dd5db7bee6d6acc"} Apr 16 13:59:31.008777 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.008753 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztqms" event={"ID":"1c601f7b-2758-4b61-a47e-bdc41ba6fb31","Type":"ContainerStarted","Data":"2ea500cfee4d87d6f81be07bdf4e8eb1236dc8baf49d53e61c7802be0d682885"} Apr 16 13:59:31.010029 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.009989 2582 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-flg9b" event={"ID":"083b5510-48e0-4313-a2e7-fca5271e9e0f","Type":"ContainerStarted","Data":"d65a44777ae4f53f0c3b12b09eb03a61c8476f01bbd66b302f6b7fef2fdaa319"} Apr 16 13:59:31.011298 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.011273 2582 generic.go:358] "Generic (PLEG): container finished" podID="dfd38cb4-73f0-4cb1-a3ee-4f877e37742f" containerID="370d37e4f2d3d85f366a5227566633bd8fb88a6ff58dccc6f4b3bb72f923742f" exitCode=0 Apr 16 13:59:31.011382 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.011342 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerDied","Data":"370d37e4f2d3d85f366a5227566633bd8fb88a6ff58dccc6f4b3bb72f923742f"} Apr 16 13:59:31.012482 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.012396 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqfgf" event={"ID":"56fe5eb2-ae67-4d8a-a719-f51bf68da0d0","Type":"ContainerStarted","Data":"3b41b992a6e96c0f7c9e8fad06bc10ea7c675c94ce4a6274f0fbc0376b6656b1"} Apr 16 13:59:31.037918 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.037882 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sqfgf" podStartSLOduration=7.08726457 podStartE2EDuration="24.037867007s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.153876339 +0000 UTC m=+2.826777611" lastFinishedPulling="2026-04-16 13:59:26.104478761 +0000 UTC m=+19.777380048" observedRunningTime="2026-04-16 13:59:31.037843739 +0000 UTC m=+24.710745044" watchObservedRunningTime="2026-04-16 13:59:31.037867007 +0000 UTC m=+24.710768301" Apr 16 13:59:31.079358 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.079316 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-flg9b" podStartSLOduration=11.496282153 podStartE2EDuration="24.079305743s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.175519589 +0000 UTC m=+2.848420863" lastFinishedPulling="2026-04-16 13:59:21.758543169 +0000 UTC m=+15.431444453" observedRunningTime="2026-04-16 13:59:31.07920351 +0000 UTC m=+24.752104805" watchObservedRunningTime="2026-04-16 13:59:31.079305743 +0000 UTC m=+24.752207073" Apr 16 13:59:31.935544 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:31.935353 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:31.935766 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:31.935633 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:32.015740 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.015706 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal" event={"ID":"eb8ae9f57d7d43deceabe76dd74043d6","Type":"ContainerStarted","Data":"35a4d0ead04f9169c7003d37f473ff247b6a7940270285760d5dfdeec5dbb66b"} Apr 16 13:59:32.018407 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.018386 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 13:59:32.018760 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.018732 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"90c139d26a41d36e2214404f4274a046fc4023d3a283c4109604fd51d6e85a54"} Apr 16 13:59:32.019290 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.019273 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:32.019354 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.019343 2582 scope.go:117] "RemoveContainer" containerID="489d4a876884d4e6e4f739f14f868d4e5ef7276b6101456796e84b1f3bdc7ab1" Apr 16 13:59:32.032076 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.032040 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ztqms" podStartSLOduration=8.112328069 podStartE2EDuration="25.032029465s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.18359306 +0000 UTC m=+2.856494346" lastFinishedPulling="2026-04-16 13:59:26.103294459 +0000 UTC m=+19.776195742" observedRunningTime="2026-04-16 13:59:31.094932653 
+0000 UTC m=+24.767833947" watchObservedRunningTime="2026-04-16 13:59:32.032029465 +0000 UTC m=+25.704930759" Apr 16 13:59:32.032194 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.032107 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-29.ec2.internal" podStartSLOduration=25.032103384 podStartE2EDuration="25.032103384s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:32.031769613 +0000 UTC m=+25.704670907" watchObservedRunningTime="2026-04-16 13:59:32.032103384 +0000 UTC m=+25.705004677" Apr 16 13:59:32.033218 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.033205 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:32.252196 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.252174 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:32.910052 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.909739 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:32.252192978Z","UUID":"5fade423-b624-469c-b4fb-f5790d683b15","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:32.912597 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.912572 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:32.912716 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.912609 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:32.938129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.938102 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:32.938253 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:32.938102 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:32.938253 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:32.938231 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:32.938367 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:32.938315 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:33.024530 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.024502 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 13:59:33.025239 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.024892 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" event={"ID":"ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9","Type":"ContainerStarted","Data":"270633e2ec89ab75750ccec865b4c847b329183212c9dd4529c1d31623805823"} Apr 16 13:59:33.025300 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.025278 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:33.025347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.025300 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:33.027559 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.027537 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" event={"ID":"03bde64a-28ea-4a04-a03d-543688a5f10e","Type":"ContainerStarted","Data":"48ca13ec4826df28db5cbb751962212fccbe7e3ffc05b5ae4f2f474e2dba1bb7"} Apr 16 13:59:33.042885 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.042860 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" Apr 16 13:59:33.055411 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.055363 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh" podStartSLOduration=8.881679744 podStartE2EDuration="26.055348519s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" 
firstStartedPulling="2026-04-16 13:59:09.171388846 +0000 UTC m=+2.844290132" lastFinishedPulling="2026-04-16 13:59:26.34505763 +0000 UTC m=+20.017958907" observedRunningTime="2026-04-16 13:59:33.054764073 +0000 UTC m=+26.727665367" watchObservedRunningTime="2026-04-16 13:59:33.055348519 +0000 UTC m=+26.728249812" Apr 16 13:59:33.256413 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.256379 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:33.257050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.257026 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:33.567254 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.567004 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxjfz"] Apr 16 13:59:33.567254 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.567117 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:33.567254 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:33.567220 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:33.569984 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.569955 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rt77p"] Apr 16 13:59:33.570109 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.570040 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:33.570178 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:33.570135 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:33.572813 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.572715 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qpb5w"] Apr 16 13:59:33.572813 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:33.572814 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:33.572976 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:33.572897 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:34.030291 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:34.030254 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:34.030742 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:34.030448 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-flg9b" Apr 16 13:59:34.938220 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:34.938187 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:34.938220 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:34.938208 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:34.938450 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:34.938189 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:34.938450 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:34.938292 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:34.938450 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:34.938370 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:34.938450 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:34.938442 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:36.937750 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:36.937588 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:36.938081 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:36.937588 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:36.938081 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:36.937825 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:36.938081 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:36.937885 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:36.938081 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:36.937599 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:36.938081 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:36.937981 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:37.036148 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:37.036121 2582 generic.go:358] "Generic (PLEG): container finished" podID="dfd38cb4-73f0-4cb1-a3ee-4f877e37742f" containerID="b40f05308c014faf4fb5655a2a2fac6ff021a84ab30c91cfc9aa19873073cb7e" exitCode=0 Apr 16 13:59:37.036287 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:37.036186 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerDied","Data":"b40f05308c014faf4fb5655a2a2fac6ff021a84ab30c91cfc9aa19873073cb7e"} Apr 16 13:59:37.038224 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:37.038204 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" event={"ID":"03bde64a-28ea-4a04-a03d-543688a5f10e","Type":"ContainerStarted","Data":"9018762b8f3dedb183c6893eb13aa9f7e40b41e76a8fcd03c7f669fd2cf2aa54"} Apr 16 13:59:38.934801 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:38.934769 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 13:59:38.935165 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:38.934874 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qpb5w" podUID="a0c4f1c8-43b6-4596-a619-0dd4cba798af" Apr 16 13:59:38.935165 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:38.934875 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 13:59:38.935165 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:38.934884 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 13:59:38.935165 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:38.934964 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rt77p" podUID="86d416f7-1028-4d19-9a65-2ecc6960eeb7" Apr 16 13:59:38.935165 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:38.935022 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hxjfz" podUID="80f4a0f5-8232-4155-a115-e7470360cc63" Apr 16 13:59:39.043256 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.043194 2582 generic.go:358] "Generic (PLEG): container finished" podID="dfd38cb4-73f0-4cb1-a3ee-4f877e37742f" containerID="4515223cd5f24f02f241ddca6dab49922f90e122340e589d60a6c5115dde23ba" exitCode=0 Apr 16 13:59:39.043256 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.043234 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerDied","Data":"4515223cd5f24f02f241ddca6dab49922f90e122340e589d60a6c5115dde23ba"} Apr 16 13:59:39.068264 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.068218 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xv6kg" podStartSLOduration=5.187071509 podStartE2EDuration="32.068204044s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.150828924 +0000 UTC m=+2.823730203" lastFinishedPulling="2026-04-16 13:59:36.031961462 +0000 UTC m=+29.704862738" observedRunningTime="2026-04-16 13:59:37.084993396 +0000 UTC m=+30.757894691" watchObservedRunningTime="2026-04-16 13:59:39.068204044 +0000 UTC m=+32.741105367" Apr 16 13:59:39.612571 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.612503 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-29.ec2.internal" event="NodeReady" Apr 16 13:59:39.612737 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.612636 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:59:39.656527 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.656481 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"] Apr 16 13:59:39.664911 ip-10-0-128-29 kubenswrapper[2582]: 
I0416 13:59:39.664888 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.669811 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.669757 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 13:59:39.669963 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.669945 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7shls\"" Apr 16 13:59:39.670288 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.670263 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 13:59:39.670376 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.670338 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 13:59:39.674968 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.674949 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 13:59:39.676277 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.676257 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"] Apr 16 13:59:39.677988 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.677964 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8cbxc"] Apr 16 13:59:39.688520 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.688501 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vf79m"] Apr 16 13:59:39.688654 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.688639 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.690996 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.690953 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tpzlf\"" Apr 16 13:59:39.691268 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.691250 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 13:59:39.691379 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.691313 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 13:59:39.696427 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.696114 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8cbxc"] Apr 16 13:59:39.696427 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.696226 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vf79m" Apr 16 13:59:39.699229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.698615 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 13:59:39.699229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.698659 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 13:59:39.699229 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.698720 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 13:59:39.699464 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.699449 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbpqw\"" Apr 16 13:59:39.703026 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.703010 
2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vf79m"] Apr 16 13:59:39.796848 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.796818 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d49qg\" (UniqueName: \"kubernetes.io/projected/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-kube-api-access-d49qg\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.796848 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.796850 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g445x\" (UniqueName: \"kubernetes.io/projected/41223147-714d-4ec2-a7b7-5febd776c247-kube-api-access-g445x\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m" Apr 16 13:59:39.797003 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.796877 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m" Apr 16 13:59:39.797003 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.796938 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-image-registry-private-configuration\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.797003 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.796976 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e70218-76f5-466f-8893-9b596d11423e-ca-trust-extracted\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.797003 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.796990 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-trusted-ca\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.797129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797008 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxx9\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-kube-api-access-vwxx9\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.797129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797031 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.797129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797051 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-config-volume\") pod \"dns-default-8cbxc\" (UID: 
\"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.797129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797067 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.797129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797083 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-registry-certificates\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.797129 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797105 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-installation-pull-secrets\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.797285 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797134 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-tmp-dir\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.797285 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.797159 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-bound-sa-token\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898240 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898187 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-image-registry-private-configuration\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898240 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898233 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e70218-76f5-466f-8893-9b596d11423e-ca-trust-extracted\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898254 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-trusted-ca\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898281 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxx9\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-kube-api-access-vwxx9\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: 
\"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898304 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.898347 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898331 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-config-volume\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.898527 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898357 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898527 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:39.898443 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:39.898527 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:39.898523 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.39849669 +0000 UTC m=+34.071397976 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found Apr 16 13:59:39.898672 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898553 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-registry-certificates\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898672 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898592 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-installation-pull-secrets\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898672 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898631 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-tmp-dir\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.898672 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:39.898449 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:39.898672 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898647 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/31e70218-76f5-466f-8893-9b596d11423e-ca-trust-extracted\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898672 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:39.898662 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found Apr 16 13:59:39.898969 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898699 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-bound-sa-token\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.898969 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:39.898738 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.398717384 +0000 UTC m=+34.071618682 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found Apr 16 13:59:39.898969 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898797 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d49qg\" (UniqueName: \"kubernetes.io/projected/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-kube-api-access-d49qg\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.898969 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898837 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g445x\" (UniqueName: \"kubernetes.io/projected/41223147-714d-4ec2-a7b7-5febd776c247-kube-api-access-g445x\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m" Apr 16 13:59:39.898969 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.898870 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m" Apr 16 13:59:39.899173 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:39.898977 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:39.899173 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.899000 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-config-volume\") pod \"dns-default-8cbxc\" (UID: 
\"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.899173 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:39.899027 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.399010872 +0000 UTC m=+34.071912158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found Apr 16 13:59:39.899265 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.899215 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-tmp-dir\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 13:59:39.899348 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.899328 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-registry-certificates\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.899802 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.899784 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-trusted-ca\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 13:59:39.902638 ip-10-0-128-29 kubenswrapper[2582]: I0416 
13:59:39.902614 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-image-registry-private-configuration\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:39.908161 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.908136 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-installation-pull-secrets\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:39.909313 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.909293 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwxx9\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-kube-api-access-vwxx9\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:39.924940 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.924920 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-bound-sa-token\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:39.925086 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.925070 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g445x\" (UniqueName: \"kubernetes.io/projected/41223147-714d-4ec2-a7b7-5febd776c247-kube-api-access-g445x\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 13:59:39.925708 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:39.925668 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d49qg\" (UniqueName: \"kubernetes.io/projected/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-kube-api-access-d49qg\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 13:59:40.403035 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.402970 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.403036 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.403057 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.403138 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.403159 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.403170 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.403171 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.403202 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.403185647 +0000 UTC m=+35.076086918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.403215 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.403209312 +0000 UTC m=+35.076110584 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found
Apr 16 13:59:40.403535 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.403226 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:41.403220665 +0000 UTC m=+35.076121937 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:40.604731 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.604705 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p"
Apr 16 13:59:40.604890 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.604776 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 13:59:40.604890 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.604824 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:40.604890 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.604845 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:40.604890 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.604888 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret podName:80f4a0f5-8232-4155-a115-e7470360cc63 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:12.604875904 +0000 UTC m=+66.277777176 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret") pod "global-pull-secret-syncer-hxjfz" (UID: "80f4a0f5-8232-4155-a115-e7470360cc63") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:40.605025 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.604900 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:12.604894364 +0000 UTC m=+66.277795636 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:40.705346 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.705289 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w"
Apr 16 13:59:40.705454 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.705421 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:40.705454 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.705438 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:40.705454 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.705447 2582 projected.go:194] Error preparing data for projected volume kube-api-access-wqxst for pod openshift-network-diagnostics/network-check-target-qpb5w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:40.705572 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:40.705492 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst podName:a0c4f1c8-43b6-4596-a619-0dd4cba798af nodeName:}" failed. No retries permitted until 2026-04-16 14:00:12.705475765 +0000 UTC m=+66.378377051 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wqxst" (UniqueName: "kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst") pod "network-check-target-qpb5w" (UID: "a0c4f1c8-43b6-4596-a619-0dd4cba798af") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:40.937554 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.937530 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 13:59:40.937727 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.937530 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p"
Apr 16 13:59:40.937727 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.937529 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w"
Apr 16 13:59:40.940049 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.939993 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:59:40.940154 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.940067 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:59:40.940212 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.940191 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 13:59:40.940299 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.940283 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gs8rq\""
Apr 16 13:59:40.940384 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.940329 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:59:40.940384 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:40.940363 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x5z8x\""
Apr 16 13:59:41.048763 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:41.048739 2582 generic.go:358] "Generic (PLEG): container finished" podID="dfd38cb4-73f0-4cb1-a3ee-4f877e37742f" containerID="2eebb49614404d4bd89e33f73eb97ff18f172c77dc307bcbca3ecb0b7226ec2a" exitCode=0
Apr 16 13:59:41.048890 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:41.048798 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerDied","Data":"2eebb49614404d4bd89e33f73eb97ff18f172c77dc307bcbca3ecb0b7226ec2a"}
Apr 16 13:59:41.411109 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:41.411031 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:41.411173 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:41.411193 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:41.411227 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:43.411213858 +0000 UTC m=+37.084115134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:41.411245 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:41.411281 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:41.411321 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:43.411310485 +0000 UTC m=+37.084211757 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:41.411332 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:41.411341 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found
Apr 16 13:59:41.411610 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:41.411380 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:43.411370045 +0000 UTC m=+37.084271331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found
Apr 16 13:59:43.427796 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:43.427763 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:43.427841 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:43.427873 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:43.427906 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:43.427970 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.427953628 +0000 UTC m=+41.100854905 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:43.427974 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:43.427989 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:43.427989 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:43.428033 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.428017812 +0000 UTC m=+41.100919091 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found
Apr 16 13:59:43.428337 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:43.428058 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.428048154 +0000 UTC m=+41.100949426 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:47.061671 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:47.061611 2582 generic.go:358] "Generic (PLEG): container finished" podID="dfd38cb4-73f0-4cb1-a3ee-4f877e37742f" containerID="f4288a72eeb01b874826c6b02e624f1810032fe24238939def160d03396a3f36" exitCode=0
Apr 16 13:59:47.061983 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:47.061667 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerDied","Data":"f4288a72eeb01b874826c6b02e624f1810032fe24238939def160d03396a3f36"}
Apr 16 13:59:47.459665 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:47.459604 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 13:59:47.459665 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:47.459637 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:47.459831 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:47.459758 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:47.459831 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:47.459813 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:47.459831 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:47.459824 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found
Apr 16 13:59:47.459923 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:47.459814 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.459799957 +0000 UTC m=+49.132701229 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:47.459923 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:47.459870 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.459859283 +0000 UTC m=+49.132760555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found
Apr 16 13:59:47.459923 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:47.459881 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 13:59:47.460024 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:47.459945 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:47.460024 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:47.459970 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.4599637 +0000 UTC m=+49.132864972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found
Apr 16 13:59:48.068824 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:48.068781 2582 generic.go:358] "Generic (PLEG): container finished" podID="dfd38cb4-73f0-4cb1-a3ee-4f877e37742f" containerID="f0e053a0061dced31fb1f4e1f84bd5ba13f64838e24f374325f73f4610fc9008" exitCode=0
Apr 16 13:59:48.069256 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:48.068837 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerDied","Data":"f0e053a0061dced31fb1f4e1f84bd5ba13f64838e24f374325f73f4610fc9008"}
Apr 16 13:59:49.073058 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:49.073022 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" event={"ID":"dfd38cb4-73f0-4cb1-a3ee-4f877e37742f","Type":"ContainerStarted","Data":"2499758dc7179d2d1cece5fef9a8b41f7141a05c647487cd4dd1d83a1dccd3de"}
Apr 16 13:59:49.095050 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:49.095010 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pjcmk" podStartSLOduration=4.471320537 podStartE2EDuration="42.094997078s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 13:59:09.160756398 +0000 UTC m=+2.833657685" lastFinishedPulling="2026-04-16 13:59:46.784432951 +0000 UTC m=+40.457334226" observedRunningTime="2026-04-16 13:59:49.094409084 +0000 UTC m=+42.767310389" watchObservedRunningTime="2026-04-16 13:59:49.094997078 +0000 UTC m=+42.767898371"
Apr 16 13:59:55.514258 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:55.514221 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 13:59:55.514258 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:55.514263 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: I0416 13:59:55.514300 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:55.514385 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:55.514385 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:55.514420 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:55.514435 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:55.514439 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.514424161 +0000 UTC m=+65.187325437 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:55.514454 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.514446928 +0000 UTC m=+65.187348199 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found
Apr 16 13:59:55.514670 ip-10-0-128-29 kubenswrapper[2582]: E0416 13:59:55.514476 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.514462396 +0000 UTC m=+65.187363668 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found
Apr 16 14:00:05.045230 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:05.045201 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np5bh"
Apr 16 14:00:11.535352 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:11.535302 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:11.535448 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:11.535491 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:11.535527 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:43.535512745 +0000 UTC m=+97.208414018 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:11.535545 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:11.535574 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:11.535624 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:11.535632 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:11.535624 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:43.535613001 +0000 UTC m=+97.208514273 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:11.535812 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:11.535655 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 14:00:43.535648098 +0000 UTC m=+97.208549370 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found
Apr 16 14:00:12.643730 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.643668 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 14:00:12.644132 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.643768 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p"
Apr 16 14:00:12.646126 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.646106 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:00:12.646288 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.646272 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:00:12.654586 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:12.654571 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:00:12.654627 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:12.654621 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs podName:86d416f7-1028-4d19-9a65-2ecc6960eeb7 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:16.654606156 +0000 UTC m=+130.327507428 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs") pod "network-metrics-daemon-rt77p" (UID: "86d416f7-1028-4d19-9a65-2ecc6960eeb7") : secret "metrics-daemon-secret" not found
Apr 16 14:00:12.656673 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.656650 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/80f4a0f5-8232-4155-a115-e7470360cc63-original-pull-secret\") pod \"global-pull-secret-syncer-hxjfz\" (UID: \"80f4a0f5-8232-4155-a115-e7470360cc63\") " pod="kube-system/global-pull-secret-syncer-hxjfz"
Apr 16 14:00:12.744856 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.744816 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w"
Apr 16 14:00:12.746748 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.746732 2582 util.go:30] "No
sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxjfz" Apr 16 14:00:12.747573 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.747426 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:00:12.756737 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.756717 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:00:12.768834 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.768803 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxst\" (UniqueName: \"kubernetes.io/projected/a0c4f1c8-43b6-4596-a619-0dd4cba798af-kube-api-access-wqxst\") pod \"network-check-target-qpb5w\" (UID: \"a0c4f1c8-43b6-4596-a619-0dd4cba798af\") " pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 14:00:12.867359 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:12.867321 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxjfz"] Apr 16 14:00:12.871231 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:00:12.871202 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f4a0f5_8232_4155_a115_e7470360cc63.slice/crio-3baff9e304038eb6907ec8516be4412ab660818f85f9a57f30c15519ff9f000d WatchSource:0}: Error finding container 3baff9e304038eb6907ec8516be4412ab660818f85f9a57f30c15519ff9f000d: Status 404 returned error can't find the container with id 3baff9e304038eb6907ec8516be4412ab660818f85f9a57f30c15519ff9f000d Apr 16 14:00:13.058562 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:13.058540 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gs8rq\"" Apr 16 14:00:13.067260 ip-10-0-128-29 
kubenswrapper[2582]: I0416 14:00:13.067243 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 14:00:13.117602 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:13.117560 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxjfz" event={"ID":"80f4a0f5-8232-4155-a115-e7470360cc63","Type":"ContainerStarted","Data":"3baff9e304038eb6907ec8516be4412ab660818f85f9a57f30c15519ff9f000d"} Apr 16 14:00:13.174349 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:13.174324 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qpb5w"] Apr 16 14:00:13.177861 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:00:13.177832 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c4f1c8_43b6_4596_a619_0dd4cba798af.slice/crio-f7074439f0f5bf6759a9598ce912b67fabd1b65a72b0dec835d53bee5fc1fe81 WatchSource:0}: Error finding container f7074439f0f5bf6759a9598ce912b67fabd1b65a72b0dec835d53bee5fc1fe81: Status 404 returned error can't find the container with id f7074439f0f5bf6759a9598ce912b67fabd1b65a72b0dec835d53bee5fc1fe81 Apr 16 14:00:14.120378 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:14.120346 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qpb5w" event={"ID":"a0c4f1c8-43b6-4596-a619-0dd4cba798af","Type":"ContainerStarted","Data":"f7074439f0f5bf6759a9598ce912b67fabd1b65a72b0dec835d53bee5fc1fe81"} Apr 16 14:00:18.130369 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:18.130331 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qpb5w" event={"ID":"a0c4f1c8-43b6-4596-a619-0dd4cba798af","Type":"ContainerStarted","Data":"34fb65894ce8a77fc32119787479e4204df08a1a956c0a361b2fed608f4e276c"} Apr 16 
14:00:18.130843 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:18.130399 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 14:00:18.131537 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:18.131516 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxjfz" event={"ID":"80f4a0f5-8232-4155-a115-e7470360cc63","Type":"ContainerStarted","Data":"d69b6cb0d12052b111cb2b5bf095d83ca70e435ae46b8b123942e7c904709aaf"} Apr 16 14:00:18.145751 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:18.145703 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qpb5w" podStartSLOduration=66.812934977 podStartE2EDuration="1m11.14567698s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 14:00:13.179651631 +0000 UTC m=+66.852552903" lastFinishedPulling="2026-04-16 14:00:17.512393635 +0000 UTC m=+71.185294906" observedRunningTime="2026-04-16 14:00:18.145354666 +0000 UTC m=+71.818255962" watchObservedRunningTime="2026-04-16 14:00:18.14567698 +0000 UTC m=+71.818578274" Apr 16 14:00:43.561949 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:43.561912 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc" Apr 16 14:00:43.561949 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:43.561953 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " 
pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:43.561999 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m" Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:43.562070 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:43.562092 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:43.562130 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:43.562144 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls podName:0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:47.562124877 +0000 UTC m=+161.235026156 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls") pod "dns-default-8cbxc" (UID: "0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9") : secret "dns-default-metrics-tls" not found Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:43.562148 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w: secret "image-registry-tls" not found Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:43.562158 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert podName:41223147-714d-4ec2-a7b7-5febd776c247 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:47.562152172 +0000 UTC m=+161.235053447 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert") pod "ingress-canary-vf79m" (UID: "41223147-714d-4ec2-a7b7-5febd776c247") : secret "canary-serving-cert" not found Apr 16 14:00:43.562390 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:43.562202 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls podName:31e70218-76f5-466f-8893-9b596d11423e nodeName:}" failed. No retries permitted until 2026-04-16 14:01:47.5621874 +0000 UTC m=+161.235088684 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls") pod "image-registry-6cb8b4dbdb-w7k9w" (UID: "31e70218-76f5-466f-8893-9b596d11423e") : secret "image-registry-tls" not found Apr 16 14:00:49.135292 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:49.135261 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qpb5w" Apr 16 14:00:49.150205 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:49.150149 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hxjfz" podStartSLOduration=97.510622003 podStartE2EDuration="1m42.150136814s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 14:00:12.872885958 +0000 UTC m=+66.545787230" lastFinishedPulling="2026-04-16 14:00:17.512400765 +0000 UTC m=+71.185302041" observedRunningTime="2026-04-16 14:00:18.161065485 +0000 UTC m=+71.833966790" watchObservedRunningTime="2026-04-16 14:00:49.150136814 +0000 UTC m=+102.823038108" Apr 16 14:00:57.636646 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.636608 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj"] Apr 16 14:00:57.639115 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.639099 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.641699 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.641631 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl"] Apr 16 14:00:57.642669 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.642648 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 14:00:57.642879 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.642860 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 14:00:57.643273 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.643256 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-jc9ls\"" Apr 16 14:00:57.643372 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.643355 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:57.643496 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.643480 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:57.643570 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.643556 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:57.648027 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.648008 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 14:00:57.648027 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.648022 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:57.648178 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.648022 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:57.648395 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.648376 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qjlkm\"" Apr 16 14:00:57.656510 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.656492 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj"] Apr 16 14:00:57.663949 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.663930 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl"] Apr 16 14:00:57.754032 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.754008 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f724d942-1eee-4167-a883-bbc5be00af26-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.754134 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.754037 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdd7\" (UniqueName: \"kubernetes.io/projected/600f68f2-3105-4729-b13b-e751d267b797-kube-api-access-2zdd7\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:57.754134 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.754110 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:57.754134 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.754130 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f724d942-1eee-4167-a883-bbc5be00af26-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.754254 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.754148 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4rv\" (UniqueName: \"kubernetes.io/projected/f724d942-1eee-4167-a883-bbc5be00af26-kube-api-access-9h4rv\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: 
\"f724d942-1eee-4167-a883-bbc5be00af26\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.854657 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.854625 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:57.854657 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.854657 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f724d942-1eee-4167-a883-bbc5be00af26-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.854880 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.854694 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4rv\" (UniqueName: \"kubernetes.io/projected/f724d942-1eee-4167-a883-bbc5be00af26-kube-api-access-9h4rv\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.854880 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.854716 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f724d942-1eee-4167-a883-bbc5be00af26-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") 
" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.854880 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.854736 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdd7\" (UniqueName: \"kubernetes.io/projected/600f68f2-3105-4729-b13b-e751d267b797-kube-api-access-2zdd7\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:57.854880 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:57.854785 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:00:57.854880 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:57.854854 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls podName:600f68f2-3105-4729-b13b-e751d267b797 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:58.354837826 +0000 UTC m=+112.027739103 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls") pod "cluster-samples-operator-667775844f-92svl" (UID: "600f68f2-3105-4729-b13b-e751d267b797") : secret "samples-operator-tls" not found Apr 16 14:00:57.855263 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.855239 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f724d942-1eee-4167-a883-bbc5be00af26-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.856870 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.856852 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f724d942-1eee-4167-a883-bbc5be00af26-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.868835 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.868817 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4rv\" (UniqueName: \"kubernetes.io/projected/f724d942-1eee-4167-a883-bbc5be00af26-kube-api-access-9h4rv\") pod \"kube-storage-version-migrator-operator-756bb7d76f-h82nj\" (UID: \"f724d942-1eee-4167-a883-bbc5be00af26\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:57.880737 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.880711 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdd7\" (UniqueName: 
\"kubernetes.io/projected/600f68f2-3105-4729-b13b-e751d267b797-kube-api-access-2zdd7\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:57.949649 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:57.949603 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" Apr 16 14:00:58.059904 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:58.059874 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj"] Apr 16 14:00:58.063179 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:00:58.063152 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf724d942_1eee_4167_a883_bbc5be00af26.slice/crio-8e83f7de612a9515f6c05cad1306a697d71ad247d1923e5e9b3e2969cdc0045f WatchSource:0}: Error finding container 8e83f7de612a9515f6c05cad1306a697d71ad247d1923e5e9b3e2969cdc0045f: Status 404 returned error can't find the container with id 8e83f7de612a9515f6c05cad1306a697d71ad247d1923e5e9b3e2969cdc0045f Apr 16 14:00:58.210249 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:58.210175 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" event={"ID":"f724d942-1eee-4167-a883-bbc5be00af26","Type":"ContainerStarted","Data":"8e83f7de612a9515f6c05cad1306a697d71ad247d1923e5e9b3e2969cdc0045f"} Apr 16 14:00:58.358187 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:58.358159 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:58.358341 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:58.358318 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:00:58.358394 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:58.358384 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls podName:600f68f2-3105-4729-b13b-e751d267b797 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.358368689 +0000 UTC m=+113.031269961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls") pod "cluster-samples-operator-667775844f-92svl" (UID: "600f68f2-3105-4729-b13b-e751d267b797") : secret "samples-operator-tls" not found Apr 16 14:00:59.367607 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:00:59.367570 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:00:59.368033 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:59.367722 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:00:59.368033 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:00:59.367785 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls podName:600f68f2-3105-4729-b13b-e751d267b797 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:01.367770107 +0000 UTC m=+115.040671382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls") pod "cluster-samples-operator-667775844f-92svl" (UID: "600f68f2-3105-4729-b13b-e751d267b797") : secret "samples-operator-tls" not found Apr 16 14:01:00.215779 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:00.215747 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" event={"ID":"f724d942-1eee-4167-a883-bbc5be00af26","Type":"ContainerStarted","Data":"6f6d25644c2b3dac139ddf44d1c2053761b17d246940a7bac18d6cf315890566"} Apr 16 14:01:00.232619 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:00.232567 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" podStartSLOduration=1.626333354 podStartE2EDuration="3.232554507s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="2026-04-16 14:00:58.065134897 +0000 UTC m=+111.738036175" lastFinishedPulling="2026-04-16 14:00:59.671356048 +0000 UTC m=+113.344257328" observedRunningTime="2026-04-16 14:01:00.231945289 +0000 UTC m=+113.904846596" watchObservedRunningTime="2026-04-16 14:01:00.232554507 +0000 UTC m=+113.905455801" Apr 16 14:01:01.382760 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:01.382724 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod 
\"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:01:01.383180 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:01.382848 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:01:01.383180 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:01.382900 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls podName:600f68f2-3105-4729-b13b-e751d267b797 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:05.382886614 +0000 UTC m=+119.055787889 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls") pod "cluster-samples-operator-667775844f-92svl" (UID: "600f68f2-3105-4729-b13b-e751d267b797") : secret "samples-operator-tls" not found Apr 16 14:01:04.264937 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.264905 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-xrdcd"] Apr 16 14:01:04.270152 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.270128 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.272108 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.272090 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 14:01:04.272212 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.272092 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 14:01:04.272441 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.272424 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-dwmzc\"" Apr 16 14:01:04.272484 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.272440 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 14:01:04.272519 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.272490 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 14:01:04.275984 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.275961 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-xrdcd"] Apr 16 14:01:04.405096 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.405053 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-signing-key\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.405096 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.405094 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bqmw\" (UniqueName: 
\"kubernetes.io/projected/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-kube-api-access-5bqmw\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.405323 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.405152 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-signing-cabundle\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.506227 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.506185 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-signing-key\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.506227 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.506223 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bqmw\" (UniqueName: \"kubernetes.io/projected/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-kube-api-access-5bqmw\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.506469 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.506279 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-signing-cabundle\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.507024 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.507004 
2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-signing-cabundle\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.508548 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.508529 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-signing-key\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.516452 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.516389 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bqmw\" (UniqueName: \"kubernetes.io/projected/2dcd4b64-3e2a-4458-8fa1-28c40f582f29-kube-api-access-5bqmw\") pod \"service-ca-bfc587fb7-xrdcd\" (UID: \"2dcd4b64-3e2a-4458-8fa1-28c40f582f29\") " pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.579393 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.579366 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" Apr 16 14:01:04.691627 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:04.691596 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-xrdcd"] Apr 16 14:01:04.694530 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:04.694503 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dcd4b64_3e2a_4458_8fa1_28c40f582f29.slice/crio-ccbb26b2ffd23a34cf2ef6f27db260dd4b2b143ef7dcd13af461aef29927d81f WatchSource:0}: Error finding container ccbb26b2ffd23a34cf2ef6f27db260dd4b2b143ef7dcd13af461aef29927d81f: Status 404 returned error can't find the container with id ccbb26b2ffd23a34cf2ef6f27db260dd4b2b143ef7dcd13af461aef29927d81f Apr 16 14:01:05.225558 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:05.225519 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" event={"ID":"2dcd4b64-3e2a-4458-8fa1-28c40f582f29","Type":"ContainerStarted","Data":"ccbb26b2ffd23a34cf2ef6f27db260dd4b2b143ef7dcd13af461aef29927d81f"} Apr 16 14:01:05.412821 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:05.412783 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:01:05.413237 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:05.412930 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:01:05.413237 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:05.412990 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls podName:600f68f2-3105-4729-b13b-e751d267b797 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:13.412973546 +0000 UTC m=+127.085874819 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls") pod "cluster-samples-operator-667775844f-92svl" (UID: "600f68f2-3105-4729-b13b-e751d267b797") : secret "samples-operator-tls" not found Apr 16 14:01:05.445759 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:05.445731 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ztqms_1c601f7b-2758-4b61-a47e-bdc41ba6fb31/dns-node-resolver/0.log" Apr 16 14:01:06.039402 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:06.039373 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sqfgf_56fe5eb2-ae67-4d8a-a719-f51bf68da0d0/node-ca/0.log" Apr 16 14:01:07.231120 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:07.231082 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" event={"ID":"2dcd4b64-3e2a-4458-8fa1-28c40f582f29","Type":"ContainerStarted","Data":"944b9ddb38e15ca94a3e7620950157d7fb0515a8023a7e65dbee0154dfcf77fc"} Apr 16 14:01:07.250151 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:07.250091 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-xrdcd" podStartSLOduration=1.6086102009999999 podStartE2EDuration="3.250076787s" podCreationTimestamp="2026-04-16 14:01:04 +0000 UTC" firstStartedPulling="2026-04-16 14:01:04.696336882 +0000 UTC m=+118.369238153" lastFinishedPulling="2026-04-16 14:01:06.337803453 +0000 UTC m=+120.010704739" observedRunningTime="2026-04-16 14:01:07.248161454 +0000 UTC m=+120.921062758" watchObservedRunningTime="2026-04-16 
14:01:07.250076787 +0000 UTC m=+120.922978080" Apr 16 14:01:07.843425 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:07.843394 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-h82nj_f724d942-1eee-4167-a883-bbc5be00af26/kube-storage-version-migrator-operator/0.log" Apr 16 14:01:13.483945 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:13.483899 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:01:13.486265 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:13.486236 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/600f68f2-3105-4729-b13b-e751d267b797-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-92svl\" (UID: \"600f68f2-3105-4729-b13b-e751d267b797\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:01:13.556167 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:13.556137 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qjlkm\"" Apr 16 14:01:13.565388 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:13.565360 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" Apr 16 14:01:13.692194 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:13.692161 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl"] Apr 16 14:01:14.246632 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:14.246589 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" event={"ID":"600f68f2-3105-4729-b13b-e751d267b797","Type":"ContainerStarted","Data":"79bd30d9d5275934fafe30709a83ca742eb53790a06366d44338d92def4e519c"} Apr 16 14:01:16.255287 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:16.255246 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" event={"ID":"600f68f2-3105-4729-b13b-e751d267b797","Type":"ContainerStarted","Data":"b889f50ddcc2af58d1560bdcbda5a3c4b876fc9dc9267f0538c162cedb5c19cc"} Apr 16 14:01:16.255287 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:16.255291 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" event={"ID":"600f68f2-3105-4729-b13b-e751d267b797","Type":"ContainerStarted","Data":"bf832a2e5c9f173928c55c358704da8832dd6de2ff422d52b39a1825a30988f7"} Apr 16 14:01:16.275895 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:16.275848 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-92svl" podStartSLOduration=17.515284818 podStartE2EDuration="19.275831411s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="2026-04-16 14:01:13.73440144 +0000 UTC m=+127.407302714" lastFinishedPulling="2026-04-16 14:01:15.494948035 +0000 UTC m=+129.167849307" observedRunningTime="2026-04-16 
14:01:16.275466444 +0000 UTC m=+129.948367738" watchObservedRunningTime="2026-04-16 14:01:16.275831411 +0000 UTC m=+129.948732704" Apr 16 14:01:16.708678 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:16.708578 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 14:01:16.710937 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:16.710909 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d416f7-1028-4d19-9a65-2ecc6960eeb7-metrics-certs\") pod \"network-metrics-daemon-rt77p\" (UID: \"86d416f7-1028-4d19-9a65-2ecc6960eeb7\") " pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 14:01:16.958047 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:16.954838 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x5z8x\"" Apr 16 14:01:16.962881 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:16.962851 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rt77p" Apr 16 14:01:17.093603 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:17.093566 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rt77p"] Apr 16 14:01:17.096871 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:17.096842 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d416f7_1028_4d19_9a65_2ecc6960eeb7.slice/crio-264f34f8c6f08f4deda3b3351e369fab9418135855f735a6c465354c94d208bd WatchSource:0}: Error finding container 264f34f8c6f08f4deda3b3351e369fab9418135855f735a6c465354c94d208bd: Status 404 returned error can't find the container with id 264f34f8c6f08f4deda3b3351e369fab9418135855f735a6c465354c94d208bd Apr 16 14:01:17.259057 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:17.258974 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rt77p" event={"ID":"86d416f7-1028-4d19-9a65-2ecc6960eeb7","Type":"ContainerStarted","Data":"264f34f8c6f08f4deda3b3351e369fab9418135855f735a6c465354c94d208bd"} Apr 16 14:01:19.266296 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:19.266258 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rt77p" event={"ID":"86d416f7-1028-4d19-9a65-2ecc6960eeb7","Type":"ContainerStarted","Data":"3ae246a41b9035c56e462e5ab783111709b0050e9d9d4f1190f69e9dc0b2e6bd"} Apr 16 14:01:19.266296 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:19.266296 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rt77p" event={"ID":"86d416f7-1028-4d19-9a65-2ecc6960eeb7","Type":"ContainerStarted","Data":"e467db4874bac172e40f571ee8d2a7c65cb0a3dfee57ebbf9121432ee4d03ce5"} Apr 16 14:01:19.281536 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:19.281498 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-rt77p" podStartSLOduration=131.186067635 podStartE2EDuration="2m12.28148396s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="2026-04-16 14:01:17.098743959 +0000 UTC m=+130.771645233" lastFinishedPulling="2026-04-16 14:01:18.194160286 +0000 UTC m=+131.867061558" observedRunningTime="2026-04-16 14:01:19.281022391 +0000 UTC m=+132.953923684" watchObservedRunningTime="2026-04-16 14:01:19.28148396 +0000 UTC m=+132.954385253" Apr 16 14:01:23.708141 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.708107 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gpnpx"] Apr 16 14:01:23.711379 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.711361 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.714467 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.714441 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:01:23.714587 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.714444 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:01:23.714587 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.714445 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:01:23.714587 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.714522 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:01:23.714587 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.714536 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-g94nx\"" Apr 16 
14:01:23.725089 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.725069 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gpnpx"] Apr 16 14:01:23.753734 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.753710 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02fa5e24-8818-4d97-9a44-c85c3daf42a9-data-volume\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.753854 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.753744 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02fa5e24-8818-4d97-9a44-c85c3daf42a9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.753854 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.753786 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02fa5e24-8818-4d97-9a44-c85c3daf42a9-crio-socket\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.753854 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.753825 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02fa5e24-8818-4d97-9a44-c85c3daf42a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 
14:01:23.753854 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.753842 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6t5l\" (UniqueName: \"kubernetes.io/projected/02fa5e24-8818-4d97-9a44-c85c3daf42a9-kube-api-access-p6t5l\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854298 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854264 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02fa5e24-8818-4d97-9a44-c85c3daf42a9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854453 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854325 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02fa5e24-8818-4d97-9a44-c85c3daf42a9-crio-socket\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854453 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854351 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02fa5e24-8818-4d97-9a44-c85c3daf42a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854453 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854370 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6t5l\" (UniqueName: 
\"kubernetes.io/projected/02fa5e24-8818-4d97-9a44-c85c3daf42a9-kube-api-access-p6t5l\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854641 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854444 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/02fa5e24-8818-4d97-9a44-c85c3daf42a9-crio-socket\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854641 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854530 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02fa5e24-8818-4d97-9a44-c85c3daf42a9-data-volume\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854805 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854788 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/02fa5e24-8818-4d97-9a44-c85c3daf42a9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.854848 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.854830 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/02fa5e24-8818-4d97-9a44-c85c3daf42a9-data-volume\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.857006 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.856991 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/02fa5e24-8818-4d97-9a44-c85c3daf42a9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:23.870008 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:23.869981 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6t5l\" (UniqueName: \"kubernetes.io/projected/02fa5e24-8818-4d97-9a44-c85c3daf42a9-kube-api-access-p6t5l\") pod \"insights-runtime-extractor-gpnpx\" (UID: \"02fa5e24-8818-4d97-9a44-c85c3daf42a9\") " pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:24.019801 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:24.019779 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gpnpx" Apr 16 14:01:24.195543 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:24.195465 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gpnpx"] Apr 16 14:01:24.196106 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:24.196073 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02fa5e24_8818_4d97_9a44_c85c3daf42a9.slice/crio-c4714828ebdea94916f9890abbb40f5564c02fdeca71f01c50517b0eebfaf8da WatchSource:0}: Error finding container c4714828ebdea94916f9890abbb40f5564c02fdeca71f01c50517b0eebfaf8da: Status 404 returned error can't find the container with id c4714828ebdea94916f9890abbb40f5564c02fdeca71f01c50517b0eebfaf8da Apr 16 14:01:24.281196 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:24.281122 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gpnpx" 
event={"ID":"02fa5e24-8818-4d97-9a44-c85c3daf42a9","Type":"ContainerStarted","Data":"24349871962b74f78e52085e0702c0ef81c0aa9b1494d1e40b2a7c8172da95d7"} Apr 16 14:01:24.281196 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:24.281157 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gpnpx" event={"ID":"02fa5e24-8818-4d97-9a44-c85c3daf42a9","Type":"ContainerStarted","Data":"c4714828ebdea94916f9890abbb40f5564c02fdeca71f01c50517b0eebfaf8da"} Apr 16 14:01:27.290004 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:27.289970 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gpnpx" event={"ID":"02fa5e24-8818-4d97-9a44-c85c3daf42a9","Type":"ContainerStarted","Data":"395dbe2aae216dff973cfaf24f45cc649179d4f85678c9e8fefa8aed5052a697"} Apr 16 14:01:29.295787 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:29.295752 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gpnpx" event={"ID":"02fa5e24-8818-4d97-9a44-c85c3daf42a9","Type":"ContainerStarted","Data":"f4957d1f06a192c8277e1ec65021bed75a2885cd7c04c4935c14404c2d7e48df"} Apr 16 14:01:29.314768 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:29.314722 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gpnpx" podStartSLOduration=1.658739043 podStartE2EDuration="6.314707918s" podCreationTimestamp="2026-04-16 14:01:23 +0000 UTC" firstStartedPulling="2026-04-16 14:01:24.247709604 +0000 UTC m=+137.920610876" lastFinishedPulling="2026-04-16 14:01:28.903678477 +0000 UTC m=+142.576579751" observedRunningTime="2026-04-16 14:01:29.313359115 +0000 UTC m=+142.986260408" watchObservedRunningTime="2026-04-16 14:01:29.314707918 +0000 UTC m=+142.987609215" Apr 16 14:01:32.145347 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.145312 2582 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"] Apr 16 14:01:32.148467 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.148449 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n" Apr 16 14:01:32.151883 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.151856 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-5j9xq\"" Apr 16 14:01:32.151997 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.151944 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 14:01:32.162353 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.162330 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"] Apr 16 14:01:32.213774 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.213735 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e8c815b4-e7d9-4b96-a516-7a00cc1a2578-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4bp2n\" (UID: \"e8c815b4-e7d9-4b96-a516-7a00cc1a2578\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n" Apr 16 14:01:32.314767 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.314734 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e8c815b4-e7d9-4b96-a516-7a00cc1a2578-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4bp2n\" (UID: \"e8c815b4-e7d9-4b96-a516-7a00cc1a2578\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n" Apr 16 14:01:32.314901 
ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:32.314869 2582 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 14:01:32.314956 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:32.314924 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8c815b4-e7d9-4b96-a516-7a00cc1a2578-tls-certificates podName:e8c815b4-e7d9-4b96-a516-7a00cc1a2578 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:32.814909184 +0000 UTC m=+146.487810460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/e8c815b4-e7d9-4b96-a516-7a00cc1a2578-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-4bp2n" (UID: "e8c815b4-e7d9-4b96-a516-7a00cc1a2578") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 14:01:32.817847 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.817802 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e8c815b4-e7d9-4b96-a516-7a00cc1a2578-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4bp2n\" (UID: \"e8c815b4-e7d9-4b96-a516-7a00cc1a2578\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"
Apr 16 14:01:32.820096 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:32.820070 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e8c815b4-e7d9-4b96-a516-7a00cc1a2578-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4bp2n\" (UID: \"e8c815b4-e7d9-4b96-a516-7a00cc1a2578\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"
Apr 16 14:01:33.062537 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:33.062506 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"
Apr 16 14:01:33.172228 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:33.172154 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"]
Apr 16 14:01:33.174574 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:33.174550 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c815b4_e7d9_4b96_a516_7a00cc1a2578.slice/crio-4d98df89564c34202eb55889c980b7e1a4e05a9468833306617e20e166bced40 WatchSource:0}: Error finding container 4d98df89564c34202eb55889c980b7e1a4e05a9468833306617e20e166bced40: Status 404 returned error can't find the container with id 4d98df89564c34202eb55889c980b7e1a4e05a9468833306617e20e166bced40
Apr 16 14:01:33.306156 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:33.306120 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n" event={"ID":"e8c815b4-e7d9-4b96-a516-7a00cc1a2578","Type":"ContainerStarted","Data":"4d98df89564c34202eb55889c980b7e1a4e05a9468833306617e20e166bced40"}
Apr 16 14:01:35.311871 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:35.311832 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n" event={"ID":"e8c815b4-e7d9-4b96-a516-7a00cc1a2578","Type":"ContainerStarted","Data":"461d4a19d0480287be4c605c4e0d4af9f09a4262f4cda02ca2fe6d295b93424d"}
Apr 16 14:01:35.312330 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:35.312044 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"
Apr 16 14:01:35.316592 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:35.316569 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n"
Apr 16 14:01:35.332107 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:35.331872 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4bp2n" podStartSLOduration=2.065071672 podStartE2EDuration="3.331857136s" podCreationTimestamp="2026-04-16 14:01:32 +0000 UTC" firstStartedPulling="2026-04-16 14:01:33.17630695 +0000 UTC m=+146.849208223" lastFinishedPulling="2026-04-16 14:01:34.443092396 +0000 UTC m=+148.115993687" observedRunningTime="2026-04-16 14:01:35.331252059 +0000 UTC m=+149.004153352" watchObservedRunningTime="2026-04-16 14:01:35.331857136 +0000 UTC m=+149.004758431"
Apr 16 14:01:40.596016 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.595975 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"]
Apr 16 14:01:40.599500 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.599477 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.605181 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.605158 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:01:40.605783 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.605765 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-nfllq\""
Apr 16 14:01:40.606402 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.606131 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:01:40.606402 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.606304 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:01:40.606402 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.606385 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 14:01:40.606586 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.606533 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:01:40.610627 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.610595 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"]
Apr 16 14:01:40.616644 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.616621 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-42bkv"]
Apr 16 14:01:40.619173 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.619156 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.621071 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.621053 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:01:40.621201 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.621175 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:01:40.621442 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.621413 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:01:40.621613 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.621454 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wstcq\""
Apr 16 14:01:40.679005 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.678976 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-wtmp\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679005 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679006 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-root\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679219 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679028 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bb6e164-2860-4f91-8060-da98bfd9c9be-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.679219 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679055 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679219 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679125 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-textfile\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679219 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679176 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb6e164-2860-4f91-8060-da98bfd9c9be-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.679410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679226 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4p78\" (UniqueName: \"kubernetes.io/projected/c452e916-8621-4d4c-aee8-8bf9764fa860-kube-api-access-j4p78\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679259 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhd6k\" (UniqueName: \"kubernetes.io/projected/2bb6e164-2860-4f91-8060-da98bfd9c9be-kube-api-access-qhd6k\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.679410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679288 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c452e916-8621-4d4c-aee8-8bf9764fa860-metrics-client-ca\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679332 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb6e164-2860-4f91-8060-da98bfd9c9be-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.679410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679363 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-accelerators-collector-config\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679387 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-sys\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.679700 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.679424 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-tls\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780098 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780071 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bb6e164-2860-4f91-8060-da98bfd9c9be-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.780098 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780105 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780314 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780122 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-textfile\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780314 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780197 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb6e164-2860-4f91-8060-da98bfd9c9be-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.780314 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780214 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4p78\" (UniqueName: \"kubernetes.io/projected/c452e916-8621-4d4c-aee8-8bf9764fa860-kube-api-access-j4p78\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780314 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780238 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhd6k\" (UniqueName: \"kubernetes.io/projected/2bb6e164-2860-4f91-8060-da98bfd9c9be-kube-api-access-qhd6k\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.780314 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780264 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c452e916-8621-4d4c-aee8-8bf9764fa860-metrics-client-ca\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780314 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780306 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb6e164-2860-4f91-8060-da98bfd9c9be-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.780615 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780336 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-accelerators-collector-config\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780615 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780363 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-sys\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780615 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780429 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-sys\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780615 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780609 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-textfile\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780842 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780819 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-tls\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780901 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780885 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-wtmp\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.780954 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780914 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-root\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.781005 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780949 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c452e916-8621-4d4c-aee8-8bf9764fa860-metrics-client-ca\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.781005 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.780997 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-root\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.781100 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:40.781051 2582 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 14:01:40.781163 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:40.781118 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-tls podName:c452e916-8621-4d4c-aee8-8bf9764fa860 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:41.281099685 +0000 UTC m=+154.954000973 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-tls") pod "node-exporter-42bkv" (UID: "c452e916-8621-4d4c-aee8-8bf9764fa860") : secret "node-exporter-tls" not found
Apr 16 14:01:40.781163 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.781116 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-wtmp\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.781163 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.781133 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-accelerators-collector-config\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.781645 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.781625 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2bb6e164-2860-4f91-8060-da98bfd9c9be-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.782894 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.782865 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2bb6e164-2860-4f91-8060-da98bfd9c9be-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.782982 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.782956 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.783229 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.783206 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bb6e164-2860-4f91-8060-da98bfd9c9be-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.792349 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.792327 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4p78\" (UniqueName: \"kubernetes.io/projected/c452e916-8621-4d4c-aee8-8bf9764fa860-kube-api-access-j4p78\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:40.793043 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.793023 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhd6k\" (UniqueName: \"kubernetes.io/projected/2bb6e164-2860-4f91-8060-da98bfd9c9be-kube-api-access-qhd6k\") pod \"openshift-state-metrics-5669946b84-nzjwm\" (UID: \"2bb6e164-2860-4f91-8060-da98bfd9c9be\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:40.909491 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:40.909414 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"
Apr 16 14:01:41.041911 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.041879 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm"]
Apr 16 14:01:41.046176 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:41.046144 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb6e164_2860_4f91_8060_da98bfd9c9be.slice/crio-24c14d40926ed881304e20d8bd013acfbef71f1c6d0013ae4141821af5c6d29f WatchSource:0}: Error finding container 24c14d40926ed881304e20d8bd013acfbef71f1c6d0013ae4141821af5c6d29f: Status 404 returned error can't find the container with id 24c14d40926ed881304e20d8bd013acfbef71f1c6d0013ae4141821af5c6d29f
Apr 16 14:01:41.285643 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.285608 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-tls\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:41.287864 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.287845 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c452e916-8621-4d4c-aee8-8bf9764fa860-node-exporter-tls\") pod \"node-exporter-42bkv\" (UID: \"c452e916-8621-4d4c-aee8-8bf9764fa860\") " pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:41.325501 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.325473 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm" event={"ID":"2bb6e164-2860-4f91-8060-da98bfd9c9be","Type":"ContainerStarted","Data":"474f5105617328d80fcf422c4d91a4d55976f28ced8a4179c0a33f77ab97b626"}
Apr 16 14:01:41.325607 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.325506 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm" event={"ID":"2bb6e164-2860-4f91-8060-da98bfd9c9be","Type":"ContainerStarted","Data":"b2ee0da1d76fc9a1f21b1fb5acf9bf5e97d5af1a4ab4b01896c9270a681a2817"}
Apr 16 14:01:41.325607 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.325520 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm" event={"ID":"2bb6e164-2860-4f91-8060-da98bfd9c9be","Type":"ContainerStarted","Data":"24c14d40926ed881304e20d8bd013acfbef71f1c6d0013ae4141821af5c6d29f"}
Apr 16 14:01:41.529183 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.529150 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-42bkv"
Apr 16 14:01:41.537859 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:41.537827 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc452e916_8621_4d4c_aee8_8bf9764fa860.slice/crio-687b6fcfed11c7d284103779e74f4093368b45d6c9ad05ba724466b35c5c3ff2 WatchSource:0}: Error finding container 687b6fcfed11c7d284103779e74f4093368b45d6c9ad05ba724466b35c5c3ff2: Status 404 returned error can't find the container with id 687b6fcfed11c7d284103779e74f4093368b45d6c9ad05ba724466b35c5c3ff2
Apr 16 14:01:41.661005 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.660948 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:01:41.664559 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.664534 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.666914 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.666885 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 14:01:41.670238 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.670217 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 14:01:41.670406 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.670387 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 14:01:41.670532 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.670455 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 14:01:41.670615 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.670595 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 14:01:41.675316 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.675295 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 14:01:41.675411 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.675335 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 14:01:41.675913 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.675869 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 14:01:41.676073 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.676055 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m8gcv\""
Apr 16 14:01:41.676178 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.676098 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 14:01:41.680913 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.680893 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:01:41.688424 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688401 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-config-volume\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688534 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688459 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-config-out\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688534 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688489 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688534 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688516 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688675 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688614 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbg6r\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-kube-api-access-tbg6r\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688748 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688695 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688748 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688733 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688842 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688757 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688842 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688784 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688842 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688808 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688977 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688849 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688977 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688900 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.688977 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.688953 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-web-config\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.789464 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789387 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-config-volume\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.789464 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789454 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-config-out\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.789672 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789485 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.789672 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789508 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.789827 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789802 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbg6r\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-kube-api-access-tbg6r\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.789912 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789892 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:01:41.789997 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789935 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.789997 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789963 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.789997 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.789991 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.790150 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.790017 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.790150 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.790043 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.790150 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.790070 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.790150 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.790113 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-web-config\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.790950 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.790922 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.791082 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:41.791066 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle podName:00213ce9-5c80-4860-9d58-545d4c389ca7 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:42.291046164 +0000 UTC m=+155.963947437 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7") : configmap references non-existent config key: ca-bundle.crt Apr 16 14:01:41.791980 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.791953 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.794528 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.794502 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-web-config\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.794991 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.794946 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-config-out\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.794991 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.794960 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.795191 ip-10-0-128-29 kubenswrapper[2582]: I0416 
14:01:41.795107 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.795324 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.795303 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-config-volume\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.797727 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.797400 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.798044 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.798006 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.798152 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.798125 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.799459 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.799432 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:41.800743 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:41.800702 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbg6r\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-kube-api-access-tbg6r\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:42.294989 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:42.294945 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:42.295915 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:42.295892 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:42.331496 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:42.331405 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm" 
event={"ID":"2bb6e164-2860-4f91-8060-da98bfd9c9be","Type":"ContainerStarted","Data":"1dca2f1d7e1f7f906d6572cf3922f4cebb58690745de1d3d93038890dea03a17"} Apr 16 14:01:42.332644 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:42.332617 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-42bkv" event={"ID":"c452e916-8621-4d4c-aee8-8bf9764fa860","Type":"ContainerStarted","Data":"687b6fcfed11c7d284103779e74f4093368b45d6c9ad05ba724466b35c5c3ff2"} Apr 16 14:01:42.351484 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:42.351429 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nzjwm" podStartSLOduration=1.456582082 podStartE2EDuration="2.35141192s" podCreationTimestamp="2026-04-16 14:01:40 +0000 UTC" firstStartedPulling="2026-04-16 14:01:41.16405704 +0000 UTC m=+154.836958313" lastFinishedPulling="2026-04-16 14:01:42.058886874 +0000 UTC m=+155.731788151" observedRunningTime="2026-04-16 14:01:42.350891546 +0000 UTC m=+156.023792842" watchObservedRunningTime="2026-04-16 14:01:42.35141192 +0000 UTC m=+156.024313218" Apr 16 14:01:42.576189 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:42.576168 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:01:42.675228 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:42.675185 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" podUID="31e70218-76f5-466f-8893-9b596d11423e" Apr 16 14:01:42.699370 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:42.699318 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8cbxc" podUID="0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9" Apr 16 14:01:42.707663 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:01:42.707635 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vf79m" podUID="41223147-714d-4ec2-a7b7-5febd776c247" Apr 16 14:01:42.718866 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:42.718843 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:01:42.755538 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:42.755509 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00213ce9_5c80_4860_9d58_545d4c389ca7.slice/crio-b90c32d3d7ee8aed9efdd6d99a8fb2674fccf116a69dcdfd170f0d1dbf4b641d WatchSource:0}: Error finding container b90c32d3d7ee8aed9efdd6d99a8fb2674fccf116a69dcdfd170f0d1dbf4b641d: Status 404 returned error can't find the container with id b90c32d3d7ee8aed9efdd6d99a8fb2674fccf116a69dcdfd170f0d1dbf4b641d Apr 16 14:01:43.337000 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.336970 2582 generic.go:358] "Generic (PLEG): container 
finished" podID="c452e916-8621-4d4c-aee8-8bf9764fa860" containerID="bb5d0eca6d9b16827e1653eca9358cb0045c41206ee8fad260c7d06efa123a37" exitCode=0 Apr 16 14:01:43.337148 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.337038 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-42bkv" event={"ID":"c452e916-8621-4d4c-aee8-8bf9764fa860","Type":"ContainerDied","Data":"bb5d0eca6d9b16827e1653eca9358cb0045c41206ee8fad260c7d06efa123a37"} Apr 16 14:01:43.338139 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.338117 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vf79m" Apr 16 14:01:43.338209 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.338136 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerStarted","Data":"b90c32d3d7ee8aed9efdd6d99a8fb2674fccf116a69dcdfd170f0d1dbf4b641d"} Apr 16 14:01:43.338209 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.338158 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 14:01:43.338487 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.338470 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8cbxc" Apr 16 14:01:43.668498 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.668404 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"] Apr 16 14:01:43.671040 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.671016 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.673236 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.673217 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 14:01:43.673407 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.673298 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-n9t65\"" Apr 16 14:01:43.673501 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.673475 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2rhodgml9ninc\"" Apr 16 14:01:43.673501 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.673496 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 14:01:43.673714 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.673514 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 14:01:43.673714 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.673633 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 14:01:43.673804 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.673760 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 14:01:43.686189 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.686164 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"] Apr 16 14:01:43.708768 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.708739 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.708959 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.708773 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-grpc-tls\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.708959 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.708805 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbsgs\" (UniqueName: \"kubernetes.io/projected/0d55217e-e9ef-473d-9e26-0468e457f308-kube-api-access-rbsgs\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.708959 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.708852 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.708959 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.708903 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0d55217e-e9ef-473d-9e26-0468e457f308-metrics-client-ca\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.708959 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.708921 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.709248 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.709012 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-tls\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.709248 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.709055 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810218 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810173 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d55217e-e9ef-473d-9e26-0468e457f308-metrics-client-ca\") pod 
\"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810355 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810229 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810355 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810282 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-tls\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810355 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810313 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810509 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810376 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " 
pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810509 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810404 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-grpc-tls\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810509 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810433 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsgs\" (UniqueName: \"kubernetes.io/projected/0d55217e-e9ef-473d-9e26-0468e457f308-kube-api-access-rbsgs\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.810509 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.810481 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.811387 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.811357 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d55217e-e9ef-473d-9e26-0468e457f308-metrics-client-ca\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.813041 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.813010 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.813305 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.813283 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.813800 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.813777 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.813800 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.813788 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:01:43.813939 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.813799 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-grpc-tls\") pod 
\"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"
Apr 16 14:01:43.813939 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.813856 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0d55217e-e9ef-473d-9e26-0468e457f308-secret-thanos-querier-tls\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"
Apr 16 14:01:43.818001 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.817982 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsgs\" (UniqueName: \"kubernetes.io/projected/0d55217e-e9ef-473d-9e26-0468e457f308-kube-api-access-rbsgs\") pod \"thanos-querier-7759bb9fbf-bph6s\" (UID: \"0d55217e-e9ef-473d-9e26-0468e457f308\") " pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"
Apr 16 14:01:43.980809 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:43.980777 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"
Apr 16 14:01:44.123564 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:44.123540 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"]
Apr 16 14:01:44.342908 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:44.342837 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-42bkv" event={"ID":"c452e916-8621-4d4c-aee8-8bf9764fa860","Type":"ContainerStarted","Data":"46e3379ae34cff10be705e757f0d16f8e373acdf7202292014bbacec5e6ea512"}
Apr 16 14:01:44.342908 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:44.342869 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-42bkv" event={"ID":"c452e916-8621-4d4c-aee8-8bf9764fa860","Type":"ContainerStarted","Data":"962796b81328aad586402fca72fe691a2d1d6453392816c232aec89ee9486358"}
Apr 16 14:01:44.347258 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:44.347232 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d55217e_e9ef_473d_9e26_0468e457f308.slice/crio-9644dbffe8f8bf079041013a46793447d842f73b2dab62677ed9b222a9b3f43d WatchSource:0}: Error finding container 9644dbffe8f8bf079041013a46793447d842f73b2dab62677ed9b222a9b3f43d: Status 404 returned error can't find the container with id 9644dbffe8f8bf079041013a46793447d842f73b2dab62677ed9b222a9b3f43d
Apr 16 14:01:44.379116 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:44.379075 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-42bkv" podStartSLOduration=3.404430712 podStartE2EDuration="4.379061374s" podCreationTimestamp="2026-04-16 14:01:40 +0000 UTC" firstStartedPulling="2026-04-16 14:01:41.539420128 +0000 UTC m=+155.212321400" lastFinishedPulling="2026-04-16 14:01:42.514050786 +0000 UTC m=+156.186952062" observedRunningTime="2026-04-16 14:01:44.377152063 +0000 UTC m=+158.050053368" watchObservedRunningTime="2026-04-16 14:01:44.379061374 +0000 UTC m=+158.051962667"
Apr 16 14:01:45.016001 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.015968 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-84c54b594d-j8qcd"]
Apr 16 14:01:45.017848 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.017832 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.020205 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.020186 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 14:01:45.020205 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.020193 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 14:01:45.020375 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.020220 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ea0ccvi33tvrm\""
Apr 16 14:01:45.020556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.020540 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 14:01:45.020638 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.020620 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:01:45.020873 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.020855 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-w6df7\""
Apr 16 14:01:45.034556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.034535 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-84c54b594d-j8qcd"]
Apr 16 14:01:45.124143 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.124115 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-secret-metrics-server-client-certs\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.124286 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.124151 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-secret-metrics-server-tls\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.124286 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.124188 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-client-ca-bundle\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.124286 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.124217 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/61818717-39e9-433d-91cd-4f4e4264af2c-audit-log\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.124286 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.124240 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61818717-39e9-433d-91cd-4f4e4264af2c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.124424 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.124323 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz4zc\" (UniqueName: \"kubernetes.io/projected/61818717-39e9-433d-91cd-4f4e4264af2c-kube-api-access-vz4zc\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.124424 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.124370 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/61818717-39e9-433d-91cd-4f4e4264af2c-metrics-server-audit-profiles\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225189 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225149 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/61818717-39e9-433d-91cd-4f4e4264af2c-metrics-server-audit-profiles\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225322 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225226 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-secret-metrics-server-client-certs\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225322 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225247 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-secret-metrics-server-tls\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225322 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225272 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-client-ca-bundle\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225322 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225293 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/61818717-39e9-433d-91cd-4f4e4264af2c-audit-log\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225322 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225315 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61818717-39e9-433d-91cd-4f4e4264af2c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225524 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225487 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz4zc\" (UniqueName: \"kubernetes.io/projected/61818717-39e9-433d-91cd-4f4e4264af2c-kube-api-access-vz4zc\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.225774 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.225726 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/61818717-39e9-433d-91cd-4f4e4264af2c-audit-log\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.226048 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.226021 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61818717-39e9-433d-91cd-4f4e4264af2c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.226262 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.226244 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/61818717-39e9-433d-91cd-4f4e4264af2c-metrics-server-audit-profiles\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.227902 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.227880 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-secret-metrics-server-tls\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.228168 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.228154 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-client-ca-bundle\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.228359 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.228339 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/61818717-39e9-433d-91cd-4f4e4264af2c-secret-metrics-server-client-certs\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.234538 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.234517 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz4zc\" (UniqueName: \"kubernetes.io/projected/61818717-39e9-433d-91cd-4f4e4264af2c-kube-api-access-vz4zc\") pod \"metrics-server-84c54b594d-j8qcd\" (UID: \"61818717-39e9-433d-91cd-4f4e4264af2c\") " pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.327146 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.327073 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd"
Apr 16 14:01:45.347274 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.347249 2582 generic.go:358] "Generic (PLEG): container finished" podID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerID="e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5" exitCode=0
Apr 16 14:01:45.347399 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.347313 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5"}
Apr 16 14:01:45.349105 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.349078 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" event={"ID":"0d55217e-e9ef-473d-9e26-0468e457f308","Type":"ContainerStarted","Data":"9644dbffe8f8bf079041013a46793447d842f73b2dab62677ed9b222a9b3f43d"}
Apr 16 14:01:45.358903 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.358879 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"]
Apr 16 14:01:45.362194 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.362176 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"
Apr 16 14:01:45.364388 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.364370 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 14:01:45.364487 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.364399 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-vlw77\""
Apr 16 14:01:45.370395 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.370235 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"]
Apr 16 14:01:45.427416 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.427373 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/449fa6c3-c8bd-4782-8e51-3417426d364f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-sn5zt\" (UID: \"449fa6c3-c8bd-4782-8e51-3417426d364f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"
Apr 16 14:01:45.471396 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.471340 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-84c54b594d-j8qcd"]
Apr 16 14:01:45.474564 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:45.474537 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61818717_39e9_433d_91cd_4f4e4264af2c.slice/crio-3d980b395bf6d8b7f5080211f3439da27e11008286127dd54c0b2b65976f9728 WatchSource:0}: Error finding container 3d980b395bf6d8b7f5080211f3439da27e11008286127dd54c0b2b65976f9728: Status 404 returned error can't find the container with id 3d980b395bf6d8b7f5080211f3439da27e11008286127dd54c0b2b65976f9728
Apr 16 14:01:45.528866 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.528827 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/449fa6c3-c8bd-4782-8e51-3417426d364f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-sn5zt\" (UID: \"449fa6c3-c8bd-4782-8e51-3417426d364f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"
Apr 16 14:01:45.531485 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.531463 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/449fa6c3-c8bd-4782-8e51-3417426d364f-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-sn5zt\" (UID: \"449fa6c3-c8bd-4782-8e51-3417426d364f\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"
Apr 16 14:01:45.675105 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.675021 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"
Apr 16 14:01:45.815093 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:45.815056 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"]
Apr 16 14:01:45.819916 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:45.819885 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod449fa6c3_c8bd_4782_8e51_3417426d364f.slice/crio-71fc87fcf9fccd7b5d23e9b59b5d84628103a9ca0e1f98e5890aae4c2d923b25 WatchSource:0}: Error finding container 71fc87fcf9fccd7b5d23e9b59b5d84628103a9ca0e1f98e5890aae4c2d923b25: Status 404 returned error can't find the container with id 71fc87fcf9fccd7b5d23e9b59b5d84628103a9ca0e1f98e5890aae4c2d923b25
Apr 16 14:01:46.353524 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:46.353487 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd" event={"ID":"61818717-39e9-433d-91cd-4f4e4264af2c","Type":"ContainerStarted","Data":"3d980b395bf6d8b7f5080211f3439da27e11008286127dd54c0b2b65976f9728"}
Apr 16 14:01:46.354662 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:46.354628 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt" event={"ID":"449fa6c3-c8bd-4782-8e51-3417426d364f","Type":"ContainerStarted","Data":"71fc87fcf9fccd7b5d23e9b59b5d84628103a9ca0e1f98e5890aae4c2d923b25"}
Apr 16 14:01:47.360963 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.360928 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerStarted","Data":"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1"}
Apr 16 14:01:47.361399 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.360972 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerStarted","Data":"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69"}
Apr 16 14:01:47.361399 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.360987 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerStarted","Data":"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c"}
Apr 16 14:01:47.361399 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.360997 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerStarted","Data":"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02"}
Apr 16 14:01:47.361399 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.361005 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerStarted","Data":"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66"}
Apr 16 14:01:47.363332 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.363306 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" event={"ID":"0d55217e-e9ef-473d-9e26-0468e457f308","Type":"ContainerStarted","Data":"906a84a0be2e87feaa5da9b4d326db307ac8c225bda46a171e3d807b79e1382b"}
Apr 16 14:01:47.363445 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.363338 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" event={"ID":"0d55217e-e9ef-473d-9e26-0468e457f308","Type":"ContainerStarted","Data":"b0e6aa089b2836f36f986ce913ff91f9a932d9eaf784e1ef56f7abf55c3e6791"}
Apr 16 14:01:47.363445 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.363352 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" event={"ID":"0d55217e-e9ef-473d-9e26-0468e457f308","Type":"ContainerStarted","Data":"b0e4c96384e50da6622744642b7b6fbe7b1d72c787e2173b8596fca6d8674d49"}
Apr 16 14:01:47.649783 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.649735 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 14:01:47.649946 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.649798 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 14:01:47.649946 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.649836 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 14:01:47.652296 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.652261 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9-metrics-tls\") pod \"dns-default-8cbxc\" (UID: \"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9\") " pod="openshift-dns/dns-default-8cbxc"
Apr 16 14:01:47.652652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.652634 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41223147-714d-4ec2-a7b7-5febd776c247-cert\") pod \"ingress-canary-vf79m\" (UID: \"41223147-714d-4ec2-a7b7-5febd776c247\") " pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 14:01:47.652769 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.652748 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"image-registry-6cb8b4dbdb-w7k9w\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 14:01:47.841341 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.841316 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tpzlf\""
Apr 16 14:01:47.841509 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.841347 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7shls\""
Apr 16 14:01:47.841509 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.841314 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbpqw\""
Apr 16 14:01:47.848965 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.848941 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vf79m"
Apr 16 14:01:47.848965 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.848961 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8cbxc"
Apr 16 14:01:47.849138 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:47.849073 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 14:01:48.370294 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.369962 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt" event={"ID":"449fa6c3-c8bd-4782-8e51-3417426d364f","Type":"ContainerStarted","Data":"4cbfee58e66a24c2ac741836fa5194e655f032bbdae5f68ece8fb4ab68108de8"}
Apr 16 14:01:48.374094 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.374064 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"
Apr 16 14:01:48.380499 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.380250 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd" event={"ID":"61818717-39e9-433d-91cd-4f4e4264af2c","Type":"ContainerStarted","Data":"122804fc3ab9786a5c1705ff1356ea0b597de13cf3d877e19578604bc8159d2c"}
Apr 16 14:01:48.381312 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.381286 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt"
Apr 16 14:01:48.388954 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.388932 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"]
Apr 16 14:01:48.393436 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.393380 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-sn5zt" podStartSLOduration=1.008694499 podStartE2EDuration="3.393363517s" podCreationTimestamp="2026-04-16 14:01:45 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.823320671 +0000 UTC m=+159.496221959" lastFinishedPulling="2026-04-16 14:01:48.20798969 +0000 UTC m=+161.880890977" observedRunningTime="2026-04-16 14:01:48.391085269 +0000 UTC m=+162.063986575" watchObservedRunningTime="2026-04-16 14:01:48.393363517 +0000 UTC m=+162.066264812"
Apr 16 14:01:48.394783 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:48.394581 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e70218_76f5_466f_8893_9b596d11423e.slice/crio-0dab6c4f7d37da254719faa891159acba10d02f8166d88d1d75981e5535b1eab WatchSource:0}: Error finding container 0dab6c4f7d37da254719faa891159acba10d02f8166d88d1d75981e5535b1eab: Status 404 returned error can't find the container with id 0dab6c4f7d37da254719faa891159acba10d02f8166d88d1d75981e5535b1eab
Apr 16 14:01:48.404959 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.404938 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vf79m"]
Apr 16 14:01:48.407107 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:48.407085 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41223147_714d_4ec2_a7b7_5febd776c247.slice/crio-628f5d14a0a8d442e8198624814d26339d08e94d00e9e7c4ce779dfc669cbb75 WatchSource:0}: Error finding container 628f5d14a0a8d442e8198624814d26339d08e94d00e9e7c4ce779dfc669cbb75: Status 404 returned error can't find the container with id 628f5d14a0a8d442e8198624814d26339d08e94d00e9e7c4ce779dfc669cbb75
Apr 16 14:01:48.413654 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.413617 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd" podStartSLOduration=1.678868496 podStartE2EDuration="4.413605503s" podCreationTimestamp="2026-04-16 14:01:44 +0000 UTC" firstStartedPulling="2026-04-16 14:01:45.476762209 +0000 UTC m=+159.149663486" lastFinishedPulling="2026-04-16 14:01:48.211499213 +0000 UTC m=+161.884400493" observedRunningTime="2026-04-16 14:01:48.412166972 +0000 UTC m=+162.085068266" watchObservedRunningTime="2026-04-16 14:01:48.413605503 +0000 UTC m=+162.086506797"
Apr 16 14:01:48.427560 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:48.427536 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8cbxc"]
Apr 16 14:01:48.432432 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:01:48.432406 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bcafdf7_3f9d_4f9c_baf2_5a3c0edbf4b9.slice/crio-0cc76950752adbde96841eed6f46278c547c594dd6bb2ff71fefde3586e4ad3e WatchSource:0}: Error finding container 0cc76950752adbde96841eed6f46278c547c594dd6bb2ff71fefde3586e4ad3e: Status 404 returned error can't find the container with id 0cc76950752adbde96841eed6f46278c547c594dd6bb2ff71fefde3586e4ad3e
Apr 16 14:01:49.387695 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.387630 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerStarted","Data":"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1"}
Apr 16 14:01:49.390453 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.390419 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" event={"ID":"0d55217e-e9ef-473d-9e26-0468e457f308","Type":"ContainerStarted","Data":"6d45cc3e7bc5d95f8d5e19e110ff444b8912b661c46266c191c5e5ba6d219605"}
Apr 16 14:01:49.390453 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.390455 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" event={"ID":"0d55217e-e9ef-473d-9e26-0468e457f308","Type":"ContainerStarted","Data":"a45d50669785fcd878465cfbbd4f98addaf74af2e3dd00ab2c848e5b5537a870"}
Apr 16 14:01:49.390646 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.390470 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" event={"ID":"0d55217e-e9ef-473d-9e26-0468e457f308","Type":"ContainerStarted","Data":"07534b096b00b6ee0cf4aa24bcf256316de97b866c4b882301bce0cf4876d6ee"}
Apr 16 14:01:49.390646 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.390577 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s"
Apr 16 14:01:49.391594 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.391573 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vf79m" event={"ID":"41223147-714d-4ec2-a7b7-5febd776c247","Type":"ContainerStarted","Data":"628f5d14a0a8d442e8198624814d26339d08e94d00e9e7c4ce779dfc669cbb75"}
Apr 16 14:01:49.393093 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.393071 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" event={"ID":"31e70218-76f5-466f-8893-9b596d11423e","Type":"ContainerStarted","Data":"30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918"}
Apr 16 14:01:49.393199 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.393100 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" event={"ID":"31e70218-76f5-466f-8893-9b596d11423e","Type":"ContainerStarted","Data":"0dab6c4f7d37da254719faa891159acba10d02f8166d88d1d75981e5535b1eab"}
Apr 16 14:01:49.393199 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.393165 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"
Apr 16 14:01:49.394181 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.394160 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8cbxc" event={"ID":"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9","Type":"ContainerStarted","Data":"0cc76950752adbde96841eed6f46278c547c594dd6bb2ff71fefde3586e4ad3e"}
Apr 16 14:01:49.428796 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.428754 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.496586256 podStartE2EDuration="8.428742094s" podCreationTimestamp="2026-04-16 14:01:41 +0000 UTC" firstStartedPulling="2026-04-16 14:01:42.75735066 +0000 UTC m=+156.430251931" lastFinishedPulling="2026-04-16 14:01:48.68950649 +0000 UTC m=+162.362407769" observedRunningTime="2026-04-16 14:01:49.42801258 +0000 UTC m=+163.100913877" watchObservedRunningTime="2026-04-16 14:01:49.428742094 +0000 UTC m=+163.101643387"
Apr 16 14:01:49.462894 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.462827 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" podStartSLOduration=162.462807757 podStartE2EDuration="2m42.462807757s" podCreationTimestamp="2026-04-16 13:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:49.462528609 +0000 UTC m=+163.135429905" watchObservedRunningTime="2026-04-16 14:01:49.462807757 +0000 UTC m=+163.135709052"
Apr 16 14:01:49.522469 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:49.522424 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" podStartSLOduration=2.180396194 podStartE2EDuration="6.522408005s" podCreationTimestamp="2026-04-16 14:01:43 +0000 UTC" firstStartedPulling="2026-04-16 14:01:44.349144175 +0000 UTC m=+158.022045447" lastFinishedPulling="2026-04-16 14:01:48.691155972 +0000 UTC m=+162.364057258" observedRunningTime="2026-04-16 14:01:49.519209572 +0000 UTC m=+163.192110878" watchObservedRunningTime="2026-04-16 14:01:49.522408005 +0000 UTC m=+163.195309298"
Apr 16 14:01:50.400233 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:50.400132 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8cbxc" event={"ID":"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9","Type":"ContainerStarted","Data":"11374bf538b54edba631c0534b346b3397a8a803b220db0b9a932c6ddce2d2f1"}
Apr 16 14:01:50.400233 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:50.400183 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8cbxc" event={"ID":"0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9","Type":"ContainerStarted","Data":"45a1cd374ee2471c4ac68e940b0e61ff3757737f9a9980a00415289c0fe4746e"}
Apr 16 14:01:50.421972 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:50.421875 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8cbxc" podStartSLOduration=130.038663024 podStartE2EDuration="2m11.421857881s" podCreationTimestamp="2026-04-16 13:59:39 +0000 UTC" firstStartedPulling="2026-04-16 14:01:48.435305473 +0000 UTC m=+162.108206753" lastFinishedPulling="2026-04-16 14:01:49.818500335 +0000 UTC m=+163.491401610" observedRunningTime="2026-04-16 14:01:50.41947042 +0000 UTC m=+164.092371751" watchObservedRunningTime="2026-04-16 14:01:50.421857881
+0000 UTC m=+164.094759175" Apr 16 14:01:51.404978 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:51.404891 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vf79m" event={"ID":"41223147-714d-4ec2-a7b7-5febd776c247","Type":"ContainerStarted","Data":"7835be8e62bc9ed6b27a747f5bafc7c9bcc6298d292b25dc77510dd5e65d2c73"} Apr 16 14:01:51.405361 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:51.405082 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8cbxc" Apr 16 14:01:51.427497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:51.427448 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vf79m" podStartSLOduration=129.821846671 podStartE2EDuration="2m12.427431204s" podCreationTimestamp="2026-04-16 13:59:39 +0000 UTC" firstStartedPulling="2026-04-16 14:01:48.408995501 +0000 UTC m=+162.081896780" lastFinishedPulling="2026-04-16 14:01:51.014580041 +0000 UTC m=+164.687481313" observedRunningTime="2026-04-16 14:01:51.426576424 +0000 UTC m=+165.099477720" watchObservedRunningTime="2026-04-16 14:01:51.427431204 +0000 UTC m=+165.100332529" Apr 16 14:01:55.406293 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:01:55.406264 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7759bb9fbf-bph6s" Apr 16 14:02:01.410769 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:01.410741 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8cbxc" Apr 16 14:02:04.939249 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.939214 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c59d6c579-gftxh"] Apr 16 14:02:04.942743 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.942720 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:04.944967 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.944947 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:02:04.945271 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.945245 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:02:04.945365 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.945271 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:02:04.945365 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.945330 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vvc8w\"" Apr 16 14:02:04.945520 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.945498 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:02:04.945520 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.945502 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:02:04.945668 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.945560 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:02:04.945820 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.945805 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:02:04.953775 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:04.953754 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c59d6c579-gftxh"] Apr 16 14:02:05.101918 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.101866 
2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-oauth-config\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.101918 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.101920 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-serving-cert\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.102165 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.101943 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-console-config\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.102165 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.102015 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-oauth-serving-cert\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.102165 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.102060 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-service-ca\") pod \"console-6c59d6c579-gftxh\" (UID: 
\"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.102165 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.102131 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsghd\" (UniqueName: \"kubernetes.io/projected/560a89dd-7638-47e3-9703-3c403ab2ff34-kube-api-access-bsghd\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.203203 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203119 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-oauth-config\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.203203 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203159 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-serving-cert\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.203203 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203182 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-console-config\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.203203 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203205 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-oauth-serving-cert\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.203487 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203319 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-service-ca\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.203487 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203364 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsghd\" (UniqueName: \"kubernetes.io/projected/560a89dd-7638-47e3-9703-3c403ab2ff34-kube-api-access-bsghd\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.203998 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203974 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-service-ca\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.204115 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.203995 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-oauth-serving-cert\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.204115 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.204049 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-console-config\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.205621 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.205601 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-oauth-config\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.205749 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.205671 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-serving-cert\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.211746 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.211724 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsghd\" (UniqueName: \"kubernetes.io/projected/560a89dd-7638-47e3-9703-3c403ab2ff34-kube-api-access-bsghd\") pod \"console-6c59d6c579-gftxh\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") " pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.251572 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.251544 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:05.328066 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.328030 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd" Apr 16 14:02:05.328202 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.328100 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd" Apr 16 14:02:05.372570 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.372537 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c59d6c579-gftxh"] Apr 16 14:02:05.375406 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:02:05.375377 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod560a89dd_7638_47e3_9703_3c403ab2ff34.slice/crio-dae694c394cd9f98c62994324622a5be3fe530b4121ad5ee313ca158aba47ef0 WatchSource:0}: Error finding container dae694c394cd9f98c62994324622a5be3fe530b4121ad5ee313ca158aba47ef0: Status 404 returned error can't find the container with id dae694c394cd9f98c62994324622a5be3fe530b4121ad5ee313ca158aba47ef0 Apr 16 14:02:05.447497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:05.447455 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c59d6c579-gftxh" event={"ID":"560a89dd-7638-47e3-9703-3c403ab2ff34","Type":"ContainerStarted","Data":"dae694c394cd9f98c62994324622a5be3fe530b4121ad5ee313ca158aba47ef0"} Apr 16 14:02:07.853738 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:07.853699 2582 patch_prober.go:28] interesting pod/image-registry-6cb8b4dbdb-w7k9w container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see 
/debug/health"}]} Apr 16 14:02:07.854184 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:07.853770 2582 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" podUID="31e70218-76f5-466f-8893-9b596d11423e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:09.461035 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:09.460998 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c59d6c579-gftxh" event={"ID":"560a89dd-7638-47e3-9703-3c403ab2ff34","Type":"ContainerStarted","Data":"3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7"} Apr 16 14:02:09.477932 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:09.477891 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c59d6c579-gftxh" podStartSLOduration=2.395963383 podStartE2EDuration="5.477878478s" podCreationTimestamp="2026-04-16 14:02:04 +0000 UTC" firstStartedPulling="2026-04-16 14:02:05.377450794 +0000 UTC m=+179.050352066" lastFinishedPulling="2026-04-16 14:02:08.459365884 +0000 UTC m=+182.132267161" observedRunningTime="2026-04-16 14:02:09.477269305 +0000 UTC m=+183.150170600" watchObservedRunningTime="2026-04-16 14:02:09.477878478 +0000 UTC m=+183.150779772" Apr 16 14:02:10.404662 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:10.404624 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 14:02:14.275560 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:14.275517 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"] Apr 16 14:02:15.252137 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:15.252100 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:15.252137 
ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:15.252141 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:15.256554 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:15.256534 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:15.484874 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:15.484843 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c59d6c579-gftxh" Apr 16 14:02:16.484860 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:16.484828 2582 generic.go:358] "Generic (PLEG): container finished" podID="f724d942-1eee-4167-a883-bbc5be00af26" containerID="6f6d25644c2b3dac139ddf44d1c2053761b17d246940a7bac18d6cf315890566" exitCode=0 Apr 16 14:02:16.485038 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:16.484900 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" event={"ID":"f724d942-1eee-4167-a883-bbc5be00af26","Type":"ContainerDied","Data":"6f6d25644c2b3dac139ddf44d1c2053761b17d246940a7bac18d6cf315890566"} Apr 16 14:02:16.485315 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:16.485264 2582 scope.go:117] "RemoveContainer" containerID="6f6d25644c2b3dac139ddf44d1c2053761b17d246940a7bac18d6cf315890566" Apr 16 14:02:17.489274 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:17.489241 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-h82nj" event={"ID":"f724d942-1eee-4167-a883-bbc5be00af26","Type":"ContainerStarted","Data":"feea7268528b3b86a319a95d8db32e67479d82f0afbbc91d55242b306a905dfa"} Apr 16 14:02:24.971359 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:24.971330 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-6c59d6c579-gftxh"] Apr 16 14:02:25.333199 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:25.333121 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd" Apr 16 14:02:25.336994 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:25.336970 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-84c54b594d-j8qcd" Apr 16 14:02:34.657840 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:34.657805 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00213ce9-5c80-4860-9d58-545d4c389ca7/init-config-reloader/0.log" Apr 16 14:02:34.855443 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:34.855413 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00213ce9-5c80-4860-9d58-545d4c389ca7/alertmanager/0.log" Apr 16 14:02:35.055595 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:35.055572 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00213ce9-5c80-4860-9d58-545d4c389ca7/config-reloader/0.log" Apr 16 14:02:35.254808 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:35.254778 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00213ce9-5c80-4860-9d58-545d4c389ca7/kube-rbac-proxy-web/0.log" Apr 16 14:02:35.454782 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:35.454716 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00213ce9-5c80-4860-9d58-545d4c389ca7/kube-rbac-proxy/0.log" Apr 16 14:02:35.655350 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:35.655317 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00213ce9-5c80-4860-9d58-545d4c389ca7/kube-rbac-proxy-metric/0.log" Apr 16 14:02:35.856749 
ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:35.856723 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00213ce9-5c80-4860-9d58-545d4c389ca7/prom-label-proxy/0.log" Apr 16 14:02:36.854876 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:36.854847 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-84c54b594d-j8qcd_61818717-39e9-433d-91cd-4f4e4264af2c/metrics-server/0.log" Apr 16 14:02:37.056574 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:37.056551 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-sn5zt_449fa6c3-c8bd-4782-8e51-3417426d364f/monitoring-plugin/0.log" Apr 16 14:02:37.257583 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:37.257562 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-42bkv_c452e916-8621-4d4c-aee8-8bf9764fa860/init-textfile/0.log" Apr 16 14:02:37.457233 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:37.457204 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-42bkv_c452e916-8621-4d4c-aee8-8bf9764fa860/node-exporter/0.log" Apr 16 14:02:37.658554 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:37.658487 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-42bkv_c452e916-8621-4d4c-aee8-8bf9764fa860/kube-rbac-proxy/0.log" Apr 16 14:02:39.055311 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:39.055281 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nzjwm_2bb6e164-2860-4f91-8060-da98bfd9c9be/kube-rbac-proxy-main/0.log" Apr 16 14:02:39.254702 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:39.254658 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nzjwm_2bb6e164-2860-4f91-8060-da98bfd9c9be/kube-rbac-proxy-self/0.log" Apr 16 14:02:39.294851 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:39.294816 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" podUID="31e70218-76f5-466f-8893-9b596d11423e" containerName="registry" containerID="cri-o://30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918" gracePeriod=30 Apr 16 14:02:39.455415 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:39.455345 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nzjwm_2bb6e164-2860-4f91-8060-da98bfd9c9be/openshift-state-metrics/0.log" Apr 16 14:02:40.401139 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.401106 2582 patch_prober.go:28] interesting pod/image-registry-6cb8b4dbdb-w7k9w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.132.0.6:5000/healthz\": dial tcp 10.132.0.6:5000: connect: connection refused" start-of-body= Apr 16 14:02:40.401526 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.401176 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" podUID="31e70218-76f5-466f-8893-9b596d11423e" containerName="registry" probeResult="failure" output="Get \"https://10.132.0.6:5000/healthz\": dial tcp 10.132.0.6:5000: connect: connection refused" Apr 16 14:02:40.529455 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.529434 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 14:02:40.557847 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.557820 2582 generic.go:358] "Generic (PLEG): container finished" podID="31e70218-76f5-466f-8893-9b596d11423e" containerID="30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918" exitCode=0 Apr 16 14:02:40.557969 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.557853 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" event={"ID":"31e70218-76f5-466f-8893-9b596d11423e","Type":"ContainerDied","Data":"30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918"} Apr 16 14:02:40.557969 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.557875 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" Apr 16 14:02:40.557969 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.557896 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w" event={"ID":"31e70218-76f5-466f-8893-9b596d11423e","Type":"ContainerDied","Data":"0dab6c4f7d37da254719faa891159acba10d02f8166d88d1d75981e5535b1eab"} Apr 16 14:02:40.557969 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.557920 2582 scope.go:117] "RemoveContainer" containerID="30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918" Apr 16 14:02:40.565330 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.565315 2582 scope.go:117] "RemoveContainer" containerID="30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918" Apr 16 14:02:40.565590 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:02:40.565572 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918\": container with ID starting with 
30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918 not found: ID does not exist" containerID="30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918" Apr 16 14:02:40.565635 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.565599 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918"} err="failed to get container status \"30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918\": rpc error: code = NotFound desc = could not find container \"30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918\": container with ID starting with 30301edcb0927c2a42d692a1a9f4befdf5c4879c3026fc948e915e8f9ab2f918 not found: ID does not exist" Apr 16 14:02:40.586576 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586551 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-registry-certificates\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " Apr 16 14:02:40.586717 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586592 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-installation-pull-secrets\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " Apr 16 14:02:40.586717 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586622 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-image-registry-private-configuration\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") " Apr 16 14:02:40.586717 
ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586640 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwxx9\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-kube-api-access-vwxx9\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") "
Apr 16 14:02:40.586717 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586662 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-bound-sa-token\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") "
Apr 16 14:02:40.586934 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586749 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-trusted-ca\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") "
Apr 16 14:02:40.586934 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586780 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") "
Apr 16 14:02:40.586934 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586805 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e70218-76f5-466f-8893-9b596d11423e-ca-trust-extracted\") pod \"31e70218-76f5-466f-8893-9b596d11423e\" (UID: \"31e70218-76f5-466f-8893-9b596d11423e\") "
Apr 16 14:02:40.587090 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.586998 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:40.587405 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.587370 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:40.590590 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.589640 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-kube-api-access-vwxx9" (OuterVolumeSpecName: "kube-api-access-vwxx9") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "kube-api-access-vwxx9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:02:40.590590 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.589920 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:02:40.590590 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.590100 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:40.590790 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.590704 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:40.591128 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.591107 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:02:40.597121 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.597095 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e70218-76f5-466f-8893-9b596d11423e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "31e70218-76f5-466f-8893-9b596d11423e" (UID: "31e70218-76f5-466f-8893-9b596d11423e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:02:40.687541 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687517 2582 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-image-registry-private-configuration\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.687541 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687540 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwxx9\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-kube-api-access-vwxx9\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.687664 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687551 2582 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-bound-sa-token\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.687664 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687561 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-trusted-ca\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.687664 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687570 2582 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/31e70218-76f5-466f-8893-9b596d11423e-registry-tls\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.687664 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687578 2582 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/31e70218-76f5-466f-8893-9b596d11423e-ca-trust-extracted\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.687664 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687586 2582 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/31e70218-76f5-466f-8893-9b596d11423e-registry-certificates\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.687664 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.687595 2582 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/31e70218-76f5-466f-8893-9b596d11423e-installation-pull-secrets\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.880728 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.880701 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"]
Apr 16 14:02:40.886818 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.886793 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6cb8b4dbdb-w7k9w"]
Apr 16 14:02:40.938980 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:40.938959 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e70218-76f5-466f-8893-9b596d11423e" path="/var/lib/kubelet/pods/31e70218-76f5-466f-8893-9b596d11423e/volumes"
Apr 16 14:02:41.454215 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:41.454186 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-4bp2n_e8c815b4-e7d9-4b96-a516-7a00cc1a2578/prometheus-operator-admission-webhook/0.log"
Apr 16 14:02:41.655174 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:41.655149 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/thanos-query/0.log"
Apr 16 14:02:41.855293 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:41.855269 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy-web/0.log"
Apr 16 14:02:42.055595 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:42.055564 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy/0.log"
Apr 16 14:02:42.257378 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:42.257355 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/prom-label-proxy/0.log"
Apr 16 14:02:42.454406 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:42.454379 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy-rules/0.log"
Apr 16 14:02:42.654221 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:42.654130 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy-metrics/0.log"
Apr 16 14:02:43.655852 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:43.655823 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c59d6c579-gftxh_560a89dd-7638-47e3-9703-3c403ab2ff34/console/0.log"
Apr 16 14:02:44.654546 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:44.654519 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sqfgf_56fe5eb2-ae67-4d8a-a719-f51bf68da0d0/node-ca/0.log"
Apr 16 14:02:45.855435 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:45.855401 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vf79m_41223147-714d-4ec2-a7b7-5febd776c247/serve-healthcheck-canary/0.log"
Apr 16 14:02:49.991498 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:49.991455 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c59d6c579-gftxh" podUID="560a89dd-7638-47e3-9703-3c403ab2ff34" containerName="console" containerID="cri-o://3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7" gracePeriod=15
Apr 16 14:02:50.225073 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.225048 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c59d6c579-gftxh_560a89dd-7638-47e3-9703-3c403ab2ff34/console/0.log"
Apr 16 14:02:50.225188 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.225109 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c59d6c579-gftxh"
Apr 16 14:02:50.264719 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.264644 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-service-ca\") pod \"560a89dd-7638-47e3-9703-3c403ab2ff34\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") "
Apr 16 14:02:50.264719 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.264695 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-serving-cert\") pod \"560a89dd-7638-47e3-9703-3c403ab2ff34\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") "
Apr 16 14:02:50.264856 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.264726 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-console-config\") pod \"560a89dd-7638-47e3-9703-3c403ab2ff34\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") "
Apr 16 14:02:50.264856 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.264785 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsghd\" (UniqueName: \"kubernetes.io/projected/560a89dd-7638-47e3-9703-3c403ab2ff34-kube-api-access-bsghd\") pod \"560a89dd-7638-47e3-9703-3c403ab2ff34\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") "
Apr 16 14:02:50.264856 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.264828 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-oauth-serving-cert\") pod \"560a89dd-7638-47e3-9703-3c403ab2ff34\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") "
Apr 16 14:02:50.265003 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.264877 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-oauth-config\") pod \"560a89dd-7638-47e3-9703-3c403ab2ff34\" (UID: \"560a89dd-7638-47e3-9703-3c403ab2ff34\") "
Apr 16 14:02:50.265157 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.265121 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-service-ca" (OuterVolumeSpecName: "service-ca") pod "560a89dd-7638-47e3-9703-3c403ab2ff34" (UID: "560a89dd-7638-47e3-9703-3c403ab2ff34"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:50.265227 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.265147 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-console-config" (OuterVolumeSpecName: "console-config") pod "560a89dd-7638-47e3-9703-3c403ab2ff34" (UID: "560a89dd-7638-47e3-9703-3c403ab2ff34"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:50.265276 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.265236 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "560a89dd-7638-47e3-9703-3c403ab2ff34" (UID: "560a89dd-7638-47e3-9703-3c403ab2ff34"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:50.266920 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.266893 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "560a89dd-7638-47e3-9703-3c403ab2ff34" (UID: "560a89dd-7638-47e3-9703-3c403ab2ff34"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:50.267014 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.266925 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560a89dd-7638-47e3-9703-3c403ab2ff34-kube-api-access-bsghd" (OuterVolumeSpecName: "kube-api-access-bsghd") pod "560a89dd-7638-47e3-9703-3c403ab2ff34" (UID: "560a89dd-7638-47e3-9703-3c403ab2ff34"). InnerVolumeSpecName "kube-api-access-bsghd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:02:50.267014 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.266974 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "560a89dd-7638-47e3-9703-3c403ab2ff34" (UID: "560a89dd-7638-47e3-9703-3c403ab2ff34"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:50.366209 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.366186 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-oauth-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:50.366209 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.366208 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-service-ca\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:50.366325 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.366219 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/560a89dd-7638-47e3-9703-3c403ab2ff34-console-serving-cert\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:50.366325 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.366227 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-console-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:50.366325 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.366236 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsghd\" (UniqueName: \"kubernetes.io/projected/560a89dd-7638-47e3-9703-3c403ab2ff34-kube-api-access-bsghd\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:50.366325 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.366246 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/560a89dd-7638-47e3-9703-3c403ab2ff34-oauth-serving-cert\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:02:50.591029 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.590981 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c59d6c579-gftxh_560a89dd-7638-47e3-9703-3c403ab2ff34/console/0.log"
Apr 16 14:02:50.591029 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.591016 2582 generic.go:358] "Generic (PLEG): container finished" podID="560a89dd-7638-47e3-9703-3c403ab2ff34" containerID="3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7" exitCode=2
Apr 16 14:02:50.591141 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.591046 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c59d6c579-gftxh" event={"ID":"560a89dd-7638-47e3-9703-3c403ab2ff34","Type":"ContainerDied","Data":"3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7"}
Apr 16 14:02:50.591141 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.591074 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c59d6c579-gftxh"
Apr 16 14:02:50.591141 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.591086 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c59d6c579-gftxh" event={"ID":"560a89dd-7638-47e3-9703-3c403ab2ff34","Type":"ContainerDied","Data":"dae694c394cd9f98c62994324622a5be3fe530b4121ad5ee313ca158aba47ef0"}
Apr 16 14:02:50.591141 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.591103 2582 scope.go:117] "RemoveContainer" containerID="3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7"
Apr 16 14:02:50.599454 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.599438 2582 scope.go:117] "RemoveContainer" containerID="3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7"
Apr 16 14:02:50.601044 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:02:50.599761 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7\": container with ID starting with 3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7 not found: ID does not exist" containerID="3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7"
Apr 16 14:02:50.601044 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.599800 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7"} err="failed to get container status \"3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7\": rpc error: code = NotFound desc = could not find container \"3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7\": container with ID starting with 3f1bea9b95d3ac514f63e962973e968ccad99c1716565993a73eec2debef7db7 not found: ID does not exist"
Apr 16 14:02:50.617113 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.617095 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c59d6c579-gftxh"]
Apr 16 14:02:50.622090 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.622073 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c59d6c579-gftxh"]
Apr 16 14:02:50.938804 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:02:50.938746 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="560a89dd-7638-47e3-9703-3c403ab2ff34" path="/var/lib/kubelet/pods/560a89dd-7638-47e3-9703-3c403ab2ff34/volumes"
Apr 16 14:03:00.942131 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:00.942097 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:03:00.942620 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:00.942472 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="alertmanager" containerID="cri-o://9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66" gracePeriod=120
Apr 16 14:03:00.942620 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:00.942548 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-web" containerID="cri-o://ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c" gracePeriod=120
Apr 16 14:03:00.942620 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:00.942535 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-metric" containerID="cri-o://79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1" gracePeriod=120
Apr 16 14:03:00.942620 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:00.942568 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="config-reloader" containerID="cri-o://7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02" gracePeriod=120
Apr 16 14:03:00.942875 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:00.942625 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy" containerID="cri-o://51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69" gracePeriod=120
Apr 16 14:03:00.942875 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:00.942636 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="prom-label-proxy" containerID="cri-o://f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1" gracePeriod=120
Apr 16 14:03:01.629178 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629146 2582 generic.go:358] "Generic (PLEG): container finished" podID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerID="f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1" exitCode=0
Apr 16 14:03:01.629178 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629171 2582 generic.go:358] "Generic (PLEG): container finished" podID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerID="51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69" exitCode=0
Apr 16 14:03:01.629178 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629180 2582 generic.go:358] "Generic (PLEG): container finished" podID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerID="7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02" exitCode=0
Apr 16 14:03:01.629178 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629186 2582 generic.go:358] "Generic (PLEG): container finished" podID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerID="9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66" exitCode=0
Apr 16 14:03:01.629429 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629218 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1"}
Apr 16 14:03:01.629429 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629252 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69"}
Apr 16 14:03:01.629429 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629262 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02"}
Apr 16 14:03:01.629429 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:01.629271 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66"}
Apr 16 14:03:02.180548 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.180524 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.253171 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253142 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-config-volume\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253338 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253181 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253338 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253212 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253338 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253249 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-tls-assets\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253338 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253267 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-main-tls\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253338 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253289 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-main-db\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253338 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253312 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-web-config\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253338 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253336 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-web\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253711 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253359 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-metrics-client-ca\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253711 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253392 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-cluster-tls-config\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253711 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253434 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-config-out\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253711 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253467 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253711 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253499 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbg6r\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-kube-api-access-tbg6r\") pod \"00213ce9-5c80-4860-9d58-545d4c389ca7\" (UID: \"00213ce9-5c80-4860-9d58-545d4c389ca7\") "
Apr 16 14:03:02.253711 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253607 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:02.254009 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.253780 2582 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:03:02.256260 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.255978 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-config-volume" (OuterVolumeSpecName: "config-volume") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:02.256260 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.256060 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-config-out" (OuterVolumeSpecName: "config-out") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:03:02.256260 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.256071 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:03:02.256473 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.256401 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:02.256759 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.256671 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:03:02.256759 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.256713 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:02.257173 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.257141 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:02.257285 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.257236 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:02.258049 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.258023 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-kube-api-access-tbg6r" (OuterVolumeSpecName: "kube-api-access-tbg6r") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "kube-api-access-tbg6r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:03:02.258444 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.258427 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:02.261125 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.261074 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:02.267190 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.267166 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-web-config" (OuterVolumeSpecName: "web-config") pod "00213ce9-5c80-4860-9d58-545d4c389ca7" (UID: "00213ce9-5c80-4860-9d58-545d4c389ca7"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:02.354527 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354503 2582 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-alertmanager-main-db\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:03:02.354527 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354527 2582 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-web-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354540 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354550 2582 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00213ce9-5c80-4860-9d58-545d4c389ca7-metrics-client-ca\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354559 2582 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\"
(UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-cluster-tls-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354568 2582 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00213ce9-5c80-4860-9d58-545d4c389ca7-config-out\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354577 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354586 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbg6r\" (UniqueName: \"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-kube-api-access-tbg6r\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354595 2582 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-config-volume\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354614 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354624 2582 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/00213ce9-5c80-4860-9d58-545d4c389ca7-tls-assets\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.354652 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.354633 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00213ce9-5c80-4860-9d58-545d4c389ca7-secret-alertmanager-main-tls\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:03:02.634434 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.634352 2582 generic.go:358] "Generic (PLEG): container finished" podID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerID="79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1" exitCode=0 Apr 16 14:03:02.634434 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.634378 2582 generic.go:358] "Generic (PLEG): container finished" podID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerID="ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c" exitCode=0 Apr 16 14:03:02.634434 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.634411 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1"} Apr 16 14:03:02.634658 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.634445 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c"} Apr 16 14:03:02.634658 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.634460 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"00213ce9-5c80-4860-9d58-545d4c389ca7","Type":"ContainerDied","Data":"b90c32d3d7ee8aed9efdd6d99a8fb2674fccf116a69dcdfd170f0d1dbf4b641d"} Apr 16 14:03:02.634658 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.634480 2582 scope.go:117] "RemoveContainer" containerID="f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1" Apr 16 14:03:02.634658 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.634481 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:03:02.641617 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.641598 2582 scope.go:117] "RemoveContainer" containerID="79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1" Apr 16 14:03:02.647955 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.647941 2582 scope.go:117] "RemoveContainer" containerID="51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69" Apr 16 14:03:02.653981 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.653963 2582 scope.go:117] "RemoveContainer" containerID="ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c" Apr 16 14:03:02.658436 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.658413 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:02.660294 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.660265 2582 scope.go:117] "RemoveContainer" containerID="7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02" Apr 16 14:03:02.663506 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.663483 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:02.667222 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.667204 2582 scope.go:117] "RemoveContainer" containerID="9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66" Apr 16 14:03:02.673386 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.673369 2582 
scope.go:117] "RemoveContainer" containerID="e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5" Apr 16 14:03:02.679370 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.679355 2582 scope.go:117] "RemoveContainer" containerID="f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1" Apr 16 14:03:02.679596 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:03:02.679577 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1\": container with ID starting with f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1 not found: ID does not exist" containerID="f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1" Apr 16 14:03:02.679647 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.679605 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1"} err="failed to get container status \"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1\": rpc error: code = NotFound desc = could not find container \"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1\": container with ID starting with f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1 not found: ID does not exist" Apr 16 14:03:02.679647 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.679624 2582 scope.go:117] "RemoveContainer" containerID="79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1" Apr 16 14:03:02.679873 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:03:02.679857 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1\": container with ID starting with 79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1 not found: ID does not 
exist" containerID="79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1" Apr 16 14:03:02.679914 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.679880 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1"} err="failed to get container status \"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1\": rpc error: code = NotFound desc = could not find container \"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1\": container with ID starting with 79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1 not found: ID does not exist" Apr 16 14:03:02.679914 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.679899 2582 scope.go:117] "RemoveContainer" containerID="51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69" Apr 16 14:03:02.680135 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:03:02.680116 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69\": container with ID starting with 51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69 not found: ID does not exist" containerID="51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69" Apr 16 14:03:02.680198 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680143 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69"} err="failed to get container status \"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69\": rpc error: code = NotFound desc = could not find container \"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69\": container with ID starting with 51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69 not found: ID does not exist" Apr 16 
14:03:02.680198 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680164 2582 scope.go:117] "RemoveContainer" containerID="ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c" Apr 16 14:03:02.680403 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:03:02.680386 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c\": container with ID starting with ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c not found: ID does not exist" containerID="ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c" Apr 16 14:03:02.680440 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680407 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c"} err="failed to get container status \"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c\": rpc error: code = NotFound desc = could not find container \"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c\": container with ID starting with ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c not found: ID does not exist" Apr 16 14:03:02.680440 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680421 2582 scope.go:117] "RemoveContainer" containerID="7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02" Apr 16 14:03:02.680634 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:03:02.680618 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02\": container with ID starting with 7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02 not found: ID does not exist" containerID="7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02" Apr 16 14:03:02.680772 
ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680639 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02"} err="failed to get container status \"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02\": rpc error: code = NotFound desc = could not find container \"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02\": container with ID starting with 7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02 not found: ID does not exist" Apr 16 14:03:02.680772 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680658 2582 scope.go:117] "RemoveContainer" containerID="9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66" Apr 16 14:03:02.680887 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:03:02.680865 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66\": container with ID starting with 9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66 not found: ID does not exist" containerID="9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66" Apr 16 14:03:02.680929 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680894 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66"} err="failed to get container status \"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66\": rpc error: code = NotFound desc = could not find container \"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66\": container with ID starting with 9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66 not found: ID does not exist" Apr 16 14:03:02.680929 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.680910 2582 scope.go:117] "RemoveContainer" 
containerID="e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5" Apr 16 14:03:02.681128 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:03:02.681111 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5\": container with ID starting with e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5 not found: ID does not exist" containerID="e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5" Apr 16 14:03:02.681163 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681133 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5"} err="failed to get container status \"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5\": rpc error: code = NotFound desc = could not find container \"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5\": container with ID starting with e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5 not found: ID does not exist" Apr 16 14:03:02.681163 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681147 2582 scope.go:117] "RemoveContainer" containerID="f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1" Apr 16 14:03:02.681365 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681349 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1"} err="failed to get container status \"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1\": rpc error: code = NotFound desc = could not find container \"f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1\": container with ID starting with f53e14ecb0b6b8c3967cbe1152cc4ba85517255f8bdc12d49ae7964718c645b1 not found: ID does not exist" Apr 16 
14:03:02.681410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681366 2582 scope.go:117] "RemoveContainer" containerID="79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1" Apr 16 14:03:02.681555 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681538 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1"} err="failed to get container status \"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1\": rpc error: code = NotFound desc = could not find container \"79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1\": container with ID starting with 79d24c810655135e1a33d3218ff3259050e0bee8a2569354636c55ac452b5ef1 not found: ID does not exist" Apr 16 14:03:02.681602 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681555 2582 scope.go:117] "RemoveContainer" containerID="51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69" Apr 16 14:03:02.681792 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681773 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69"} err="failed to get container status \"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69\": rpc error: code = NotFound desc = could not find container \"51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69\": container with ID starting with 51b7e013402397b72e677a90faefaa9f0cbaa67170f733f954a1feb039e5ed69 not found: ID does not exist" Apr 16 14:03:02.681836 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681793 2582 scope.go:117] "RemoveContainer" containerID="ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c" Apr 16 14:03:02.682006 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.681989 2582 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c"} err="failed to get container status \"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c\": rpc error: code = NotFound desc = could not find container \"ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c\": container with ID starting with ebdbc12e256bd71f9d856d0e44e35705ed37dd2fb2f5b7074cfdf9576b0e6f6c not found: ID does not exist" Apr 16 14:03:02.682045 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.682007 2582 scope.go:117] "RemoveContainer" containerID="7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02" Apr 16 14:03:02.682233 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.682216 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02"} err="failed to get container status \"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02\": rpc error: code = NotFound desc = could not find container \"7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02\": container with ID starting with 7efba38b1c31dccfd25122e71e23fe2ae303718806915fd37b40ff000030ee02 not found: ID does not exist" Apr 16 14:03:02.682272 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.682233 2582 scope.go:117] "RemoveContainer" containerID="9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66" Apr 16 14:03:02.682453 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.682436 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66"} err="failed to get container status \"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66\": rpc error: code = NotFound desc = could not find container \"9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66\": container with ID starting with 
9f5a92a85ff9325f826e33893bb7e9561e9edace63d1d40f250a716fc903de66 not found: ID does not exist" Apr 16 14:03:02.682498 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.682453 2582 scope.go:117] "RemoveContainer" containerID="e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5" Apr 16 14:03:02.682647 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.682624 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5"} err="failed to get container status \"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5\": rpc error: code = NotFound desc = could not find container \"e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5\": container with ID starting with e726b98f193f22bfd01bfec1181f7c8ffbd304e806fda7ca59ab8895900b4cd5 not found: ID does not exist" Apr 16 14:03:02.695050 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695029 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:03:02.695317 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695305 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="init-config-reloader" Apr 16 14:03:02.695357 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695319 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="init-config-reloader" Apr 16 14:03:02.695357 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695326 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="prom-label-proxy" Apr 16 14:03:02.695357 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695332 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="prom-label-proxy" Apr 16 
14:03:02.695357 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695343 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="alertmanager" Apr 16 14:03:02.695357 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695349 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="alertmanager" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695359 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695365 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695375 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-metric" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695380 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-metric" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695391 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-web" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695396 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-web" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695401 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="560a89dd-7638-47e3-9703-3c403ab2ff34" containerName="console" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695406 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="560a89dd-7638-47e3-9703-3c403ab2ff34" containerName="console" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695411 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31e70218-76f5-466f-8893-9b596d11423e" containerName="registry" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695416 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e70218-76f5-466f-8893-9b596d11423e" containerName="registry" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695421 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="config-reloader" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695426 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="config-reloader" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695475 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="alertmanager" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695482 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="config-reloader" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695489 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-web" Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695496 2582 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="prom-label-proxy"
Apr 16 14:03:02.695497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695503 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy-metric"
Apr 16 14:03:02.695981 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695510 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="560a89dd-7638-47e3-9703-3c403ab2ff34" containerName="console"
Apr 16 14:03:02.695981 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695517 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" containerName="kube-rbac-proxy"
Apr 16 14:03:02.695981 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.695524 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="31e70218-76f5-466f-8893-9b596d11423e" containerName="registry"
Apr 16 14:03:02.699160 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.699145 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.701380 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.701361 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m8gcv\""
Apr 16 14:03:02.701466 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.701401 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 14:03:02.701466 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.701426 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 14:03:02.701653 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.701639 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 14:03:02.701729 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.701698 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 14:03:02.701729 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.701713 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 14:03:02.702136 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.702118 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 14:03:02.702243 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.702140 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 14:03:02.702243 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.702159 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 14:03:02.715962 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.715936 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:03:02.717020 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.716974 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 14:03:02.757472 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757446 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/625a9a09-ac89-4da9-992e-aef778a26bf2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757581 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757480 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757581 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757508 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757581 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757524 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-web-config\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757581 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757555 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/625a9a09-ac89-4da9-992e-aef778a26bf2-config-out\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757581 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757574 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2lm\" (UniqueName: \"kubernetes.io/projected/625a9a09-ac89-4da9-992e-aef778a26bf2-kube-api-access-4l2lm\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757780 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757595 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757780 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757613 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757780 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757630 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757780 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757649 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/625a9a09-ac89-4da9-992e-aef778a26bf2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757780 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757737 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/625a9a09-ac89-4da9-992e-aef778a26bf2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757780 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757776 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-config-volume\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.757952 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.757801 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/625a9a09-ac89-4da9-992e-aef778a26bf2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859069 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859026 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859069 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859075 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859331 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859096 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859331 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859219 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/625a9a09-ac89-4da9-992e-aef778a26bf2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859331 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859262 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/625a9a09-ac89-4da9-992e-aef778a26bf2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859331 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859296 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-config-volume\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859533 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859337 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/625a9a09-ac89-4da9-992e-aef778a26bf2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859533 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859397 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/625a9a09-ac89-4da9-992e-aef778a26bf2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859533 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859434 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859533 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859533 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859492 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-web-config\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859808 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859533 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/625a9a09-ac89-4da9-992e-aef778a26bf2-config-out\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859808 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859571 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2lm\" (UniqueName: \"kubernetes.io/projected/625a9a09-ac89-4da9-992e-aef778a26bf2-kube-api-access-4l2lm\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.859904 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.859849 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/625a9a09-ac89-4da9-992e-aef778a26bf2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.860268 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.860241 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/625a9a09-ac89-4da9-992e-aef778a26bf2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.860495 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.860469 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/625a9a09-ac89-4da9-992e-aef778a26bf2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862292 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862166 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-config-volume\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862292 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862273 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/625a9a09-ac89-4da9-992e-aef778a26bf2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862292 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862278 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862513 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862283 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862573 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862515 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862850 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862829 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/625a9a09-ac89-4da9-992e-aef778a26bf2-config-out\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862936 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862888 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.862936 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.862914 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-web-config\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.864337 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.864321 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/625a9a09-ac89-4da9-992e-aef778a26bf2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.867795 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.867729 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2lm\" (UniqueName: \"kubernetes.io/projected/625a9a09-ac89-4da9-992e-aef778a26bf2-kube-api-access-4l2lm\") pod \"alertmanager-main-0\" (UID: \"625a9a09-ac89-4da9-992e-aef778a26bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:02.939547 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:02.939469 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00213ce9-5c80-4860-9d58-545d4c389ca7" path="/var/lib/kubelet/pods/00213ce9-5c80-4860-9d58-545d4c389ca7/volumes"
Apr 16 14:03:03.008868 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:03.008822 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:03:03.142072 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:03.141990 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:03:03.144704 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:03:03.144657 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod625a9a09_ac89_4da9_992e_aef778a26bf2.slice/crio-a042173c899d686dee735fe3c06a5dcc8fbf26227659aa07d038d6e4c6eeebd0 WatchSource:0}: Error finding container a042173c899d686dee735fe3c06a5dcc8fbf26227659aa07d038d6e4c6eeebd0: Status 404 returned error can't find the container with id a042173c899d686dee735fe3c06a5dcc8fbf26227659aa07d038d6e4c6eeebd0
Apr 16 14:03:03.638620 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:03.638587 2582 generic.go:358] "Generic (PLEG): container finished" podID="625a9a09-ac89-4da9-992e-aef778a26bf2" containerID="b04e027fe3cbb056f1f0e466ee4ef03ddf8d0b4c3d5d34b1e4b32496af5d549b" exitCode=0
Apr 16 14:03:03.639049 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:03.638655 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerDied","Data":"b04e027fe3cbb056f1f0e466ee4ef03ddf8d0b4c3d5d34b1e4b32496af5d549b"}
Apr 16 14:03:03.639049 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:03.638696 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerStarted","Data":"a042173c899d686dee735fe3c06a5dcc8fbf26227659aa07d038d6e4c6eeebd0"}
Apr 16 14:03:04.644556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.644515 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerStarted","Data":"1d46b9cd3f4581ccde5ace4727eb98dc43ecc115ff5368aeb7a164a7bd1641d7"}
Apr 16 14:03:04.644556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.644559 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerStarted","Data":"92baeb1d436d4e59cfdce16df6b474f2b2687126ee89b2a525c0f10801937ab2"}
Apr 16 14:03:04.645113 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.644568 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerStarted","Data":"0065cba7df5cb8ac7e7707aed64d14c8e19003afd30b2b410dace7005ea473df"}
Apr 16 14:03:04.645113 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.644578 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerStarted","Data":"3509573e9f537cad6ae9dfddf659cc52d830e741f955fad4db43728ef348e994"}
Apr 16 14:03:04.645113 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.644586 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerStarted","Data":"f1ce8f6622ead8a704f17390de2801e6d7e30fa7575d82e66dddea1e3597da4a"}
Apr 16 14:03:04.645113 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.644595 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"625a9a09-ac89-4da9-992e-aef778a26bf2","Type":"ContainerStarted","Data":"6c838804f18d0ee0be96c8d049c4dca3d5d1b47aef9838a38c5e444f350ea7d5"}
Apr 16 14:03:04.684616 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.684576 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.684562793 podStartE2EDuration="2.684562793s" podCreationTimestamp="2026-04-16 14:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:04.683034063 +0000 UTC m=+238.355935368" watchObservedRunningTime="2026-04-16 14:03:04.684562793 +0000 UTC m=+238.357464086"
Apr 16 14:03:04.984701 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.984660 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"]
Apr 16 14:03:04.987197 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.987179 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:04.989205 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.989180 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 14:03:04.989297 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.989218 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-vctsq\""
Apr 16 14:03:04.989665 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.989623 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 14:03:04.989722 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.989692 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 14:03:04.989869 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.989855 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 14:03:04.989913 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.989881 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 14:03:04.994174 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:04.994150 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 14:03:05.003844 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.003825 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"]
Apr 16 14:03:05.076962 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.076938 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.077104 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.076979 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-secret-telemeter-client\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.077104 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.077096 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-federate-client-tls\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.077231 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.077178 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-serving-certs-ca-bundle\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.077231 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.077210 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-telemeter-client-tls\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.077312 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.077240 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngmx\" (UniqueName: \"kubernetes.io/projected/a7934c32-5099-418e-979d-b87db5205932-kube-api-access-6ngmx\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.077350 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.077328 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.077413 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.077394 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-metrics-client-ca\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178161 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178114 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-serving-certs-ca-bundle\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178161 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178169 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-telemeter-client-tls\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178372 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178201 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngmx\" (UniqueName: \"kubernetes.io/projected/a7934c32-5099-418e-979d-b87db5205932-kube-api-access-6ngmx\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178372 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178231 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178372 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178260 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-metrics-client-ca\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178372 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178290 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178372 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178324 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-secret-telemeter-client\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178372 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178367 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-federate-client-tls\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.178967 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178942 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-serving-certs-ca-bundle\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.179097 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.178953 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-metrics-client-ca\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.179371 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.179348 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7934c32-5099-418e-979d-b87db5205932-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.180873 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.180844 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-secret-telemeter-client\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.180951 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.180909 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-federate-client-tls\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.181092 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.181074 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.181190 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.181171 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a7934c32-5099-418e-979d-b87db5205932-telemeter-client-tls\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.186405 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.186378 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngmx\" (UniqueName: \"kubernetes.io/projected/a7934c32-5099-418e-979d-b87db5205932-kube-api-access-6ngmx\") pod \"telemeter-client-5cb45775c4-bn7dz\" (UID: \"a7934c32-5099-418e-979d-b87db5205932\") " pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.297552 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.297475 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"
Apr 16 14:03:05.436656 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.436628 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5cb45775c4-bn7dz"]
Apr 16 14:03:05.440178 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:03:05.440143 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7934c32_5099_418e_979d_b87db5205932.slice/crio-0a51701f2f905e1947dbb89c85a4f58e1279ad80982e067b00bc4417685b130f WatchSource:0}: Error finding container 0a51701f2f905e1947dbb89c85a4f58e1279ad80982e067b00bc4417685b130f: Status 404 returned error can't find the container with id 0a51701f2f905e1947dbb89c85a4f58e1279ad80982e067b00bc4417685b130f
Apr 16 14:03:05.648655 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:05.648575 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz" event={"ID":"a7934c32-5099-418e-979d-b87db5205932","Type":"ContainerStarted","Data":"0a51701f2f905e1947dbb89c85a4f58e1279ad80982e067b00bc4417685b130f"}
Apr 16 14:03:07.662115 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:07.662079 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz" event={"ID":"a7934c32-5099-418e-979d-b87db5205932","Type":"ContainerStarted","Data":"b2c2f74c3da891c1570975c27487ede71e1c1f070fcc05450a8f4b06c03611be"}
Apr 16 14:03:07.662115 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:07.662112 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz" event={"ID":"a7934c32-5099-418e-979d-b87db5205932","Type":"ContainerStarted","Data":"986a10cda63378457838a51ad25dd8b6b6e5448aa71a75c8b7c4fd15f099a697"}
Apr 16 14:03:07.662115 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:07.662122 2582 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz" event={"ID":"a7934c32-5099-418e-979d-b87db5205932","Type":"ContainerStarted","Data":"8cc83acdddcf3af73ced0c538245ff6fbdcc4f58e4e9a3e33fa8d63ce4f426d9"} Apr 16 14:03:07.695114 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:07.695059 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5cb45775c4-bn7dz" podStartSLOduration=2.070952522 podStartE2EDuration="3.695043s" podCreationTimestamp="2026-04-16 14:03:04 +0000 UTC" firstStartedPulling="2026-04-16 14:03:05.442115926 +0000 UTC m=+239.115017204" lastFinishedPulling="2026-04-16 14:03:07.06620641 +0000 UTC m=+240.739107682" observedRunningTime="2026-04-16 14:03:07.694445619 +0000 UTC m=+241.367346924" watchObservedRunningTime="2026-04-16 14:03:07.695043 +0000 UTC m=+241.367944293" Apr 16 14:03:08.537813 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.537779 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86d5499b9c-pw4j8"] Apr 16 14:03:08.540213 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.540196 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.544718 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.544701 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:03:08.544718 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.544709 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:03:08.545913 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.545896 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:03:08.546898 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.546875 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vvc8w\"" Apr 16 14:03:08.546997 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.546960 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:03:08.547149 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.547128 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:03:08.547231 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.547132 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:03:08.547231 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.547162 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:03:08.550269 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.550252 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:03:08.556967 ip-10-0-128-29 kubenswrapper[2582]: 
I0416 14:03:08.556938 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d5499b9c-pw4j8"] Apr 16 14:03:08.609438 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.609414 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-oauth-serving-cert\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.609556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.609446 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-trusted-ca-bundle\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.609556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.609484 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-console-config\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.609556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.609504 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-service-ca\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.609556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.609524 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-s9hs4\" (UniqueName: \"kubernetes.io/projected/7ccef071-6e19-4702-abf3-840268c00e99-kube-api-access-s9hs4\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.609556 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.609552 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-oauth-config\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.609750 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.609598 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-serving-cert\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.710273 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.710246 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-oauth-serving-cert\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.710273 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.710275 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-trusted-ca-bundle\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.710724 
ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.710298 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-console-config\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.710724 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.710314 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-service-ca\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.710724 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.710344 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9hs4\" (UniqueName: \"kubernetes.io/projected/7ccef071-6e19-4702-abf3-840268c00e99-kube-api-access-s9hs4\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.710724 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.710379 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-oauth-config\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.710724 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.710507 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-serving-cert\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " 
pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.711121 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.711083 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-console-config\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.711219 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.711159 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-oauth-serving-cert\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.711288 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.711264 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-service-ca\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.711528 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.711510 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-trusted-ca-bundle\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.712975 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.712954 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-oauth-config\") pod \"console-86d5499b9c-pw4j8\" (UID: 
\"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.713049 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.713007 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-serving-cert\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.718048 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.718029 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9hs4\" (UniqueName: \"kubernetes.io/projected/7ccef071-6e19-4702-abf3-840268c00e99-kube-api-access-s9hs4\") pod \"console-86d5499b9c-pw4j8\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.849574 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.849523 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:08.970119 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:08.970089 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d5499b9c-pw4j8"] Apr 16 14:03:08.972765 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:03:08.972736 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ccef071_6e19_4702_abf3_840268c00e99.slice/crio-06ba8c79e07ff85b9357c0d6cfd3e94836cbd2a14b499ec71a7f9470767e0c8d WatchSource:0}: Error finding container 06ba8c79e07ff85b9357c0d6cfd3e94836cbd2a14b499ec71a7f9470767e0c8d: Status 404 returned error can't find the container with id 06ba8c79e07ff85b9357c0d6cfd3e94836cbd2a14b499ec71a7f9470767e0c8d Apr 16 14:03:09.668843 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:09.668806 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d5499b9c-pw4j8" event={"ID":"7ccef071-6e19-4702-abf3-840268c00e99","Type":"ContainerStarted","Data":"6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf"} Apr 16 14:03:09.668843 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:09.668844 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d5499b9c-pw4j8" event={"ID":"7ccef071-6e19-4702-abf3-840268c00e99","Type":"ContainerStarted","Data":"06ba8c79e07ff85b9357c0d6cfd3e94836cbd2a14b499ec71a7f9470767e0c8d"} Apr 16 14:03:09.686437 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:09.686384 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86d5499b9c-pw4j8" podStartSLOduration=1.686367567 podStartE2EDuration="1.686367567s" podCreationTimestamp="2026-04-16 14:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:09.685243494 +0000 UTC m=+243.358144781" 
watchObservedRunningTime="2026-04-16 14:03:09.686367567 +0000 UTC m=+243.359268864" Apr 16 14:03:18.850222 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:18.850140 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:18.850574 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:18.850438 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:18.854862 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:18.854842 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:03:19.703793 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:03:19.703764 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:04:06.856300 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:06.856272 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 14:04:06.856300 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:06.856281 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 14:04:06.862309 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:06.862292 2582 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:04:23.761586 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.761544 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74ddcfb9c7-d7drv"] Apr 16 14:04:23.764052 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.764031 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.779784 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.779759 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74ddcfb9c7-d7drv"] Apr 16 14:04:23.893986 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.893952 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-oauth-serving-cert\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.894179 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.894005 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-serving-cert\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.894179 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.894023 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-oauth-config\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.894179 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.894081 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlprs\" (UniqueName: \"kubernetes.io/projected/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-kube-api-access-dlprs\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " 
pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.894179 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.894142 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-config\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.894179 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.894174 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-service-ca\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.894362 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.894191 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-trusted-ca-bundle\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.995410 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.995368 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-config\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.995571 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.995423 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-service-ca\") 
pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.995648 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.995625 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-trusted-ca-bundle\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.995769 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.995752 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-oauth-serving-cert\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.995849 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.995833 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-serving-cert\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.995892 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.995869 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-oauth-config\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.995937 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.995895 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlprs\" (UniqueName: 
\"kubernetes.io/projected/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-kube-api-access-dlprs\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.996276 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.996244 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-service-ca\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.996276 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.996265 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-config\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.996451 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.996356 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-oauth-serving-cert\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.996662 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.996645 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-trusted-ca-bundle\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.998302 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.998268 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-oauth-config\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:23.998406 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:23.998388 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-serving-cert\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:24.007079 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:24.007054 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlprs\" (UniqueName: \"kubernetes.io/projected/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-kube-api-access-dlprs\") pod \"console-74ddcfb9c7-d7drv\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") " pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:24.073644 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:24.073561 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:24.214115 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:24.214090 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74ddcfb9c7-d7drv"] Apr 16 14:04:24.216287 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:04:24.216257 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93e66dc_e1e5_4bc0_a5a4_207a0a77567c.slice/crio-47668aa02b24d4ce5919b41b0b8f2afa0ba68ff15149e97cc9d2d61b76a41ce8 WatchSource:0}: Error finding container 47668aa02b24d4ce5919b41b0b8f2afa0ba68ff15149e97cc9d2d61b76a41ce8: Status 404 returned error can't find the container with id 47668aa02b24d4ce5919b41b0b8f2afa0ba68ff15149e97cc9d2d61b76a41ce8 Apr 16 14:04:24.218042 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:24.218026 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:04:24.882162 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:24.882125 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ddcfb9c7-d7drv" event={"ID":"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c","Type":"ContainerStarted","Data":"facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e"} Apr 16 14:04:24.882162 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:24.882163 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ddcfb9c7-d7drv" event={"ID":"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c","Type":"ContainerStarted","Data":"47668aa02b24d4ce5919b41b0b8f2afa0ba68ff15149e97cc9d2d61b76a41ce8"} Apr 16 14:04:24.900543 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:24.900502 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74ddcfb9c7-d7drv" podStartSLOduration=1.900486796 podStartE2EDuration="1.900486796s" podCreationTimestamp="2026-04-16 14:04:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:04:24.900409818 +0000 UTC m=+318.573311123" watchObservedRunningTime="2026-04-16 14:04:24.900486796 +0000 UTC m=+318.573388090" Apr 16 14:04:34.073714 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:34.073663 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:34.074085 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:34.073744 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:34.078313 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:34.078292 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:34.914593 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:34.914566 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74ddcfb9c7-d7drv" Apr 16 14:04:34.964833 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:04:34.964806 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86d5499b9c-pw4j8"] Apr 16 14:05:00.931823 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:00.931716 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86d5499b9c-pw4j8" podUID="7ccef071-6e19-4702-abf3-840268c00e99" containerName="console" containerID="cri-o://6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf" gracePeriod=15 Apr 16 14:05:01.162529 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.162505 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86d5499b9c-pw4j8_7ccef071-6e19-4702-abf3-840268c00e99/console/0.log" Apr 16 14:05:01.162638 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.162565 2582 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:05:01.284400 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284368 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-oauth-config\") pod \"7ccef071-6e19-4702-abf3-840268c00e99\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " Apr 16 14:05:01.284566 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284412 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-trusted-ca-bundle\") pod \"7ccef071-6e19-4702-abf3-840268c00e99\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " Apr 16 14:05:01.284566 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284448 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9hs4\" (UniqueName: \"kubernetes.io/projected/7ccef071-6e19-4702-abf3-840268c00e99-kube-api-access-s9hs4\") pod \"7ccef071-6e19-4702-abf3-840268c00e99\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " Apr 16 14:05:01.284566 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284467 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-service-ca\") pod \"7ccef071-6e19-4702-abf3-840268c00e99\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " Apr 16 14:05:01.284566 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284485 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-oauth-serving-cert\") pod \"7ccef071-6e19-4702-abf3-840268c00e99\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " Apr 16 
14:05:01.284566 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284517 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-serving-cert\") pod \"7ccef071-6e19-4702-abf3-840268c00e99\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " Apr 16 14:05:01.284566 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284549 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-console-config\") pod \"7ccef071-6e19-4702-abf3-840268c00e99\" (UID: \"7ccef071-6e19-4702-abf3-840268c00e99\") " Apr 16 14:05:01.285028 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.284998 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-service-ca" (OuterVolumeSpecName: "service-ca") pod "7ccef071-6e19-4702-abf3-840268c00e99" (UID: "7ccef071-6e19-4702-abf3-840268c00e99"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:05:01.285149 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.285044 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7ccef071-6e19-4702-abf3-840268c00e99" (UID: "7ccef071-6e19-4702-abf3-840268c00e99"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:05:01.285149 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.285066 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-console-config" (OuterVolumeSpecName: "console-config") pod "7ccef071-6e19-4702-abf3-840268c00e99" (UID: "7ccef071-6e19-4702-abf3-840268c00e99"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:05:01.285149 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.285104 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7ccef071-6e19-4702-abf3-840268c00e99" (UID: "7ccef071-6e19-4702-abf3-840268c00e99"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:05:01.286552 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.286528 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7ccef071-6e19-4702-abf3-840268c00e99" (UID: "7ccef071-6e19-4702-abf3-840268c00e99"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:05:01.287043 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.287023 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ccef071-6e19-4702-abf3-840268c00e99-kube-api-access-s9hs4" (OuterVolumeSpecName: "kube-api-access-s9hs4") pod "7ccef071-6e19-4702-abf3-840268c00e99" (UID: "7ccef071-6e19-4702-abf3-840268c00e99"). InnerVolumeSpecName "kube-api-access-s9hs4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:05:01.287112 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.287056 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7ccef071-6e19-4702-abf3-840268c00e99" (UID: "7ccef071-6e19-4702-abf3-840268c00e99"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:05:01.385469 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.385432 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-oauth-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:05:01.385469 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.385462 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-trusted-ca-bundle\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:05:01.385469 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.385476 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s9hs4\" (UniqueName: \"kubernetes.io/projected/7ccef071-6e19-4702-abf3-840268c00e99-kube-api-access-s9hs4\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:05:01.385735 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.385490 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-service-ca\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:05:01.385735 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.385502 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-oauth-serving-cert\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:05:01.385735 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.385514 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ccef071-6e19-4702-abf3-840268c00e99-console-serving-cert\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:05:01.385735 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.385525 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ccef071-6e19-4702-abf3-840268c00e99-console-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\"" Apr 16 14:05:01.988467 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.988440 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86d5499b9c-pw4j8_7ccef071-6e19-4702-abf3-840268c00e99/console/0.log" Apr 16 14:05:01.988940 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.988478 2582 generic.go:358] "Generic (PLEG): container finished" podID="7ccef071-6e19-4702-abf3-840268c00e99" containerID="6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf" exitCode=2 Apr 16 14:05:01.988940 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.988539 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86d5499b9c-pw4j8" Apr 16 14:05:01.988940 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.988579 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d5499b9c-pw4j8" event={"ID":"7ccef071-6e19-4702-abf3-840268c00e99","Type":"ContainerDied","Data":"6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf"} Apr 16 14:05:01.988940 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.988628 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d5499b9c-pw4j8" event={"ID":"7ccef071-6e19-4702-abf3-840268c00e99","Type":"ContainerDied","Data":"06ba8c79e07ff85b9357c0d6cfd3e94836cbd2a14b499ec71a7f9470767e0c8d"} Apr 16 14:05:01.988940 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.988651 2582 scope.go:117] "RemoveContainer" containerID="6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf" Apr 16 14:05:01.997093 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.997077 2582 scope.go:117] "RemoveContainer" containerID="6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf" Apr 16 14:05:01.997340 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:05:01.997321 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf\": container with ID starting with 6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf not found: ID does not exist" containerID="6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf" Apr 16 14:05:01.997400 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:01.997347 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf"} err="failed to get container status \"6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf\": rpc error: code = 
NotFound desc = could not find container \"6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf\": container with ID starting with 6dd2bfe0e7d3689ac744ecc0a7757c81d80caaeed751a1df05484dbbede197bf not found: ID does not exist" Apr 16 14:05:02.008518 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:02.008495 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86d5499b9c-pw4j8"] Apr 16 14:05:02.014664 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:02.014645 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86d5499b9c-pw4j8"] Apr 16 14:05:02.938513 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:02.938479 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ccef071-6e19-4702-abf3-840268c00e99" path="/var/lib/kubelet/pods/7ccef071-6e19-4702-abf3-840268c00e99/volumes" Apr 16 14:05:28.631178 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.631144 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n"] Apr 16 14:05:28.631530 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.631473 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ccef071-6e19-4702-abf3-840268c00e99" containerName="console" Apr 16 14:05:28.631530 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.631484 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ccef071-6e19-4702-abf3-840268c00e99" containerName="console" Apr 16 14:05:28.631602 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.631541 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ccef071-6e19-4702-abf3-840268c00e99" containerName="console" Apr 16 14:05:28.633264 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.633247 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" Apr 16 14:05:28.636004 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.635980 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-rzkfc\"" Apr 16 14:05:28.636155 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.636010 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 14:05:28.636155 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.636067 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 14:05:28.636297 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.636164 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 14:05:28.636297 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.636267 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 14:05:28.644224 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.644203 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n"] Apr 16 14:05:28.684725 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.684702 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"] Apr 16 14:05:28.686975 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.686958 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" Apr 16 14:05:28.690957 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.690938 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 14:05:28.702647 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.702629 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"] Apr 16 14:05:28.704050 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.704031 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"] Apr 16 14:05:28.706454 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.706441 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.708244 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.708228 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 14:05:28.708323 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.708306 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 14:05:28.708536 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.708517 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 14:05:28.708866 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.708847 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" 
Apr 16 14:05:28.711415 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.711383 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfzv\" (UniqueName: \"kubernetes.io/projected/ae0a65ba-581b-4bae-89c9-dd3855b82890-kube-api-access-4tfzv\") pod \"managed-serviceaccount-addon-agent-5c877dfbb9-6s55n\" (UID: \"ae0a65ba-581b-4bae-89c9-dd3855b82890\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" Apr 16 14:05:28.711498 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.711458 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ae0a65ba-581b-4bae-89c9-dd3855b82890-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c877dfbb9-6s55n\" (UID: \"ae0a65ba-581b-4bae-89c9-dd3855b82890\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" Apr 16 14:05:28.716711 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.716674 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"] Apr 16 14:05:28.812459 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812423 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ae0a65ba-581b-4bae-89c9-dd3855b82890-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c877dfbb9-6s55n\" (UID: \"ae0a65ba-581b-4bae-89c9-dd3855b82890\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" Apr 16 14:05:28.812597 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812509 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb55t\" (UniqueName: 
\"kubernetes.io/projected/4983e76c-9a13-4792-9694-7478eef95d00-kube-api-access-hb55t\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" Apr 16 14:05:28.812597 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812555 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-ca\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.812597 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812593 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.812719 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812617 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngvr9\" (UniqueName: \"kubernetes.io/projected/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-kube-api-access-ngvr9\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.812719 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812655 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4983e76c-9a13-4792-9694-7478eef95d00-tmp\") pod 
\"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" Apr 16 14:05:28.812719 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812677 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.812817 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812760 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfzv\" (UniqueName: \"kubernetes.io/projected/ae0a65ba-581b-4bae-89c9-dd3855b82890-kube-api-access-4tfzv\") pod \"managed-serviceaccount-addon-agent-5c877dfbb9-6s55n\" (UID: \"ae0a65ba-581b-4bae-89c9-dd3855b82890\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" Apr 16 14:05:28.812817 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812789 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-hub\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.812817 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812804 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: 
\"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.812991 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.812830 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4983e76c-9a13-4792-9694-7478eef95d00-klusterlet-config\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" Apr 16 14:05:28.814872 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.814854 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ae0a65ba-581b-4bae-89c9-dd3855b82890-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c877dfbb9-6s55n\" (UID: \"ae0a65ba-581b-4bae-89c9-dd3855b82890\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" Apr 16 14:05:28.820207 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.820187 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfzv\" (UniqueName: \"kubernetes.io/projected/ae0a65ba-581b-4bae-89c9-dd3855b82890-kube-api-access-4tfzv\") pod \"managed-serviceaccount-addon-agent-5c877dfbb9-6s55n\" (UID: \"ae0a65ba-581b-4bae-89c9-dd3855b82890\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" Apr 16 14:05:28.913353 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913293 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-ca\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 
14:05:28.913353 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913320 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.913353 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913338 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngvr9\" (UniqueName: \"kubernetes.io/projected/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-kube-api-access-ngvr9\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.913565 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913359 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4983e76c-9a13-4792-9694-7478eef95d00-tmp\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" Apr 16 14:05:28.913565 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913382 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.913565 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913435 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" 
(UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-hub\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.913565 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913460 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" Apr 16 14:05:28.913565 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913491 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4983e76c-9a13-4792-9694-7478eef95d00-klusterlet-config\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" Apr 16 14:05:28.913565 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913554 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb55t\" (UniqueName: \"kubernetes.io/projected/4983e76c-9a13-4792-9694-7478eef95d00-kube-api-access-hb55t\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" Apr 16 14:05:28.913908 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.913793 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4983e76c-9a13-4792-9694-7478eef95d00-tmp\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"
Apr 16 14:05:28.914299 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.914271 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"
Apr 16 14:05:28.915869 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.915844 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-ca\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"
Apr 16 14:05:28.916004 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.915901 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"
Apr 16 14:05:28.916004 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.915991 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-hub\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"
Apr 16 14:05:28.916188 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.916171 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"
Apr 16 14:05:28.916301 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.916284 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/4983e76c-9a13-4792-9694-7478eef95d00-klusterlet-config\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"
Apr 16 14:05:28.922050 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.922028 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngvr9\" (UniqueName: \"kubernetes.io/projected/3161aa2c-8dd0-44f2-98a8-7bd920509ba0-kube-api-access-ngvr9\") pod \"cluster-proxy-proxy-agent-795665fd-c9lq7\" (UID: \"3161aa2c-8dd0-44f2-98a8-7bd920509ba0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"
Apr 16 14:05:28.923173 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.923146 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb55t\" (UniqueName: \"kubernetes.io/projected/4983e76c-9a13-4792-9694-7478eef95d00-kube-api-access-hb55t\") pod \"klusterlet-addon-workmgr-bc976cb56-8hq9c\" (UID: \"4983e76c-9a13-4792-9694-7478eef95d00\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"
Apr 16 14:05:28.955068 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.955045 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n"
Apr 16 14:05:28.995743 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:28.995701 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"
Apr 16 14:05:29.014798 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:29.014548 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"
Apr 16 14:05:29.081732 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:29.081708 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n"]
Apr 16 14:05:29.132850 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:29.132828 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"]
Apr 16 14:05:29.135451 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:05:29.135425 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4983e76c_9a13_4792_9694_7478eef95d00.slice/crio-b349a18e37aba843dc9bd9e41cb5d32e6c58ff1b5bec82219d4b7f9bf8f87130 WatchSource:0}: Error finding container b349a18e37aba843dc9bd9e41cb5d32e6c58ff1b5bec82219d4b7f9bf8f87130: Status 404 returned error can't find the container with id b349a18e37aba843dc9bd9e41cb5d32e6c58ff1b5bec82219d4b7f9bf8f87130
Apr 16 14:05:29.152485 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:29.152464 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7"]
Apr 16 14:05:29.154565 ip-10-0-128-29 kubenswrapper[2582]: W0416 14:05:29.154538 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3161aa2c_8dd0_44f2_98a8_7bd920509ba0.slice/crio-3b991a66b99276ebda6d63a64214a75fc6a91e0dce27c70ce60831518f55cef9 WatchSource:0}: Error finding container 3b991a66b99276ebda6d63a64214a75fc6a91e0dce27c70ce60831518f55cef9: Status 404 returned error can't find the container with id 3b991a66b99276ebda6d63a64214a75fc6a91e0dce27c70ce60831518f55cef9
Apr 16 14:05:30.076899 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:30.076823 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" event={"ID":"3161aa2c-8dd0-44f2-98a8-7bd920509ba0","Type":"ContainerStarted","Data":"3b991a66b99276ebda6d63a64214a75fc6a91e0dce27c70ce60831518f55cef9"}
Apr 16 14:05:30.080032 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:30.079974 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" event={"ID":"ae0a65ba-581b-4bae-89c9-dd3855b82890","Type":"ContainerStarted","Data":"bd8c5475c61fe3ecb037b8349d5d0c23f30e82459829b23e4ec3f95b47f5aa69"}
Apr 16 14:05:30.081944 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:30.081902 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" event={"ID":"4983e76c-9a13-4792-9694-7478eef95d00","Type":"ContainerStarted","Data":"b349a18e37aba843dc9bd9e41cb5d32e6c58ff1b5bec82219d4b7f9bf8f87130"}
Apr 16 14:05:34.096587 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:34.096544 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" event={"ID":"ae0a65ba-581b-4bae-89c9-dd3855b82890","Type":"ContainerStarted","Data":"026c396984921cafeb23b6c97caf64ea624d18ee87eed5ab4ff53820fc27f6a1"}
Apr 16 14:05:34.097986 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:34.097956 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" event={"ID":"4983e76c-9a13-4792-9694-7478eef95d00","Type":"ContainerStarted","Data":"adce06ead51edc3b631315a69a8e5930ba7c3934313ee2cf0dd20c6c70edf1d5"}
Apr 16 14:05:34.098143 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:34.098115 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"
Apr 16 14:05:34.099324 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:34.099300 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" event={"ID":"3161aa2c-8dd0-44f2-98a8-7bd920509ba0","Type":"ContainerStarted","Data":"8f88ed5812f04b7546a3c39455ee11f62a056b6c873a43a864303d709ca1c176"}
Apr 16 14:05:34.099751 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:34.099731 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c"
Apr 16 14:05:34.114260 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:34.114224 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c877dfbb9-6s55n" podStartSLOduration=1.301080183 podStartE2EDuration="6.114212192s" podCreationTimestamp="2026-04-16 14:05:28 +0000 UTC" firstStartedPulling="2026-04-16 14:05:29.087531604 +0000 UTC m=+382.760432879" lastFinishedPulling="2026-04-16 14:05:33.900663602 +0000 UTC m=+387.573564888" observedRunningTime="2026-04-16 14:05:34.112225401 +0000 UTC m=+387.785126705" watchObservedRunningTime="2026-04-16 14:05:34.114212192 +0000 UTC m=+387.787113485"
Apr 16 14:05:34.130342 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:34.130304 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bc976cb56-8hq9c" podStartSLOduration=1.34872492 podStartE2EDuration="6.130292298s" podCreationTimestamp="2026-04-16 14:05:28 +0000 UTC" firstStartedPulling="2026-04-16 14:05:29.137134736 +0000 UTC m=+382.810036008" lastFinishedPulling="2026-04-16 14:05:33.918702101 +0000 UTC m=+387.591603386" observedRunningTime="2026-04-16 14:05:34.128111649 +0000 UTC m=+387.801012944" watchObservedRunningTime="2026-04-16 14:05:34.130292298 +0000 UTC m=+387.803193592"
Apr 16 14:05:38.114213 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:38.114172 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" event={"ID":"3161aa2c-8dd0-44f2-98a8-7bd920509ba0","Type":"ContainerStarted","Data":"63f5f0ad0d2213bc6471127e2053bcf53addb7ac816bff87739f4491f6790f52"}
Apr 16 14:05:38.114213 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:38.114208 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" event={"ID":"3161aa2c-8dd0-44f2-98a8-7bd920509ba0","Type":"ContainerStarted","Data":"b5f08d8f6177251477a5c1d7a7a69de6802c1037fff122e52b5e573e74485f53"}
Apr 16 14:05:38.131969 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:05:38.131926 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-795665fd-c9lq7" podStartSLOduration=2.001474103 podStartE2EDuration="10.131912647s" podCreationTimestamp="2026-04-16 14:05:28 +0000 UTC" firstStartedPulling="2026-04-16 14:05:29.156130521 +0000 UTC m=+382.829031793" lastFinishedPulling="2026-04-16 14:05:37.286569062 +0000 UTC m=+390.959470337" observedRunningTime="2026-04-16 14:05:38.130567482 +0000 UTC m=+391.803468775" watchObservedRunningTime="2026-04-16 14:05:38.131912647 +0000 UTC m=+391.804813941"
Apr 16 14:08:19.976227 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:19.976197 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74ddcfb9c7-d7drv"]
Apr 16 14:08:44.995132 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:44.995068 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74ddcfb9c7-d7drv" podUID="e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" containerName="console" containerID="cri-o://facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e" gracePeriod=15
Apr 16 14:08:45.231756 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.231735 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74ddcfb9c7-d7drv_e93e66dc-e1e5-4bc0-a5a4-207a0a77567c/console/0.log"
Apr 16 14:08:45.231870 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.231794 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74ddcfb9c7-d7drv"
Apr 16 14:08:45.392631 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.392555 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlprs\" (UniqueName: \"kubernetes.io/projected/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-kube-api-access-dlprs\") pod \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") "
Apr 16 14:08:45.392631 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.392589 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-serving-cert\") pod \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") "
Apr 16 14:08:45.392895 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.392647 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-oauth-config\") pod \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") "
Apr 16 14:08:45.392895 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.392678 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-config\") pod \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") "
Apr 16 14:08:45.392895 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.392740 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-service-ca\") pod \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") "
Apr 16 14:08:45.392895 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.392764 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-trusted-ca-bundle\") pod \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") "
Apr 16 14:08:45.392895 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.392806 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-oauth-serving-cert\") pod \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\" (UID: \"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c\") "
Apr 16 14:08:45.393166 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.393139 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-service-ca" (OuterVolumeSpecName: "service-ca") pod "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" (UID: "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:08:45.393234 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.393146 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-config" (OuterVolumeSpecName: "console-config") pod "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" (UID: "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:08:45.393347 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.393188 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" (UID: "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:08:45.393406 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.393357 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" (UID: "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:08:45.394886 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.394856 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-kube-api-access-dlprs" (OuterVolumeSpecName: "kube-api-access-dlprs") pod "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" (UID: "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c"). InnerVolumeSpecName "kube-api-access-dlprs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:08:45.394886 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.394863 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" (UID: "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:08:45.395012 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.394892 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" (UID: "e93e66dc-e1e5-4bc0-a5a4-207a0a77567c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:08:45.493474 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.493448 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-trusted-ca-bundle\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:08:45.493474 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.493470 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-oauth-serving-cert\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:08:45.493613 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.493480 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dlprs\" (UniqueName: \"kubernetes.io/projected/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-kube-api-access-dlprs\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:08:45.493613 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.493489 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-serving-cert\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:08:45.493613 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.493498 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-oauth-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:08:45.493613 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.493508 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-console-config\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:08:45.493613 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.493519 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c-service-ca\") on node \"ip-10-0-128-29.ec2.internal\" DevicePath \"\""
Apr 16 14:08:45.665512 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.665459 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74ddcfb9c7-d7drv_e93e66dc-e1e5-4bc0-a5a4-207a0a77567c/console/0.log"
Apr 16 14:08:45.665512 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.665492 2582 generic.go:358] "Generic (PLEG): container finished" podID="e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" containerID="facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e" exitCode=2
Apr 16 14:08:45.665655 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.665555 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74ddcfb9c7-d7drv"
Apr 16 14:08:45.665655 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.665580 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ddcfb9c7-d7drv" event={"ID":"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c","Type":"ContainerDied","Data":"facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e"}
Apr 16 14:08:45.665655 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.665617 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ddcfb9c7-d7drv" event={"ID":"e93e66dc-e1e5-4bc0-a5a4-207a0a77567c","Type":"ContainerDied","Data":"47668aa02b24d4ce5919b41b0b8f2afa0ba68ff15149e97cc9d2d61b76a41ce8"}
Apr 16 14:08:45.665655 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.665633 2582 scope.go:117] "RemoveContainer" containerID="facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e"
Apr 16 14:08:45.674092 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.674070 2582 scope.go:117] "RemoveContainer" containerID="facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e"
Apr 16 14:08:45.674343 ip-10-0-128-29 kubenswrapper[2582]: E0416 14:08:45.674325 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e\": container with ID starting with facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e not found: ID does not exist" containerID="facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e"
Apr 16 14:08:45.674391 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.674351 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e"} err="failed to get container status \"facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e\": rpc error: code = NotFound desc = could not find container \"facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e\": container with ID starting with facf8ac71b2988f783b769e7fa969332afb02c8f47ca70d983320f6bc0ee374e not found: ID does not exist"
Apr 16 14:08:45.684438 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.684418 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74ddcfb9c7-d7drv"]
Apr 16 14:08:45.688547 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:45.688524 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74ddcfb9c7-d7drv"]
Apr 16 14:08:46.939497 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:08:46.939466 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" path="/var/lib/kubelet/pods/e93e66dc-e1e5-4bc0-a5a4-207a0a77567c/volumes"
Apr 16 14:09:06.878868 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:09:06.878842 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:09:06.880136 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:09:06.880113 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:14:06.903494 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:14:06.903407 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:14:06.905735 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:14:06.905709 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:19:06.927046 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:19:06.927019 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:19:06.927858 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:19:06.927829 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:24:06.949234 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:24:06.949202 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:24:06.951622 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:24:06.951601 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:29:06.970753 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:29:06.970720 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:29:06.973452 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:29:06.973429 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:34:06.995084 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:34:06.995046 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:34:06.999415 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:34:06.999367 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:39:07.020672 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:39:07.020642 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:39:07.025274 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:39:07.025253 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:44:07.041598 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:44:07.041570 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:44:07.051528 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:44:07.051506 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:49:07.065066 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:49:07.064951 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:49:07.075511 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:49:07.075486 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:54:07.086231 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:54:07.086125 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:54:07.096627 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:54:07.096610 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:59:07.107463 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:59:07.107353 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 14:59:07.119578 ip-10-0-128-29 kubenswrapper[2582]: I0416 14:59:07.119551 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 15:04:07.129836 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:07.129720 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 15:04:07.142804 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:07.142781 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log"
Apr 16 15:04:17.610749 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:17.610719 2582 ???:1] "http: TLS handshake error from 10.0.142.16:45224: EOF"
Apr 16 15:04:17.618277 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:17.618253 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hxjfz_80f4a0f5-8232-4155-a115-e7470360cc63/global-pull-secret-syncer/0.log"
Apr 16 15:04:17.811826 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:17.811792 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-flg9b_083b5510-48e0-4313-a2e7-fca5271e9e0f/konnectivity-agent/0.log"
Apr 16 15:04:17.871077 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:17.871003 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-29.ec2.internal_80eaaec527b04d922efdac38ef2d0c20/haproxy/0.log"
Apr 16 15:04:21.601934 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.601904 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_625a9a09-ac89-4da9-992e-aef778a26bf2/alertmanager/0.log"
Apr 16 15:04:21.626132 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.626110 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_625a9a09-ac89-4da9-992e-aef778a26bf2/config-reloader/0.log"
Apr 16 15:04:21.653051 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.653025 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_625a9a09-ac89-4da9-992e-aef778a26bf2/kube-rbac-proxy-web/0.log"
Apr 16 15:04:21.678595 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.678570 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_625a9a09-ac89-4da9-992e-aef778a26bf2/kube-rbac-proxy/0.log"
Apr 16 15:04:21.702475 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.702455 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_625a9a09-ac89-4da9-992e-aef778a26bf2/kube-rbac-proxy-metric/0.log"
Apr 16 15:04:21.730220 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.730196 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_625a9a09-ac89-4da9-992e-aef778a26bf2/prom-label-proxy/0.log"
Apr 16 15:04:21.756366 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.756346 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_625a9a09-ac89-4da9-992e-aef778a26bf2/init-config-reloader/0.log"
Apr 16 15:04:21.905368 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.905313 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-84c54b594d-j8qcd_61818717-39e9-433d-91cd-4f4e4264af2c/metrics-server/0.log"
Apr 16 15:04:21.934549 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.934522 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-sn5zt_449fa6c3-c8bd-4782-8e51-3417426d364f/monitoring-plugin/0.log"
Apr 16 15:04:21.966025 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.966002 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-42bkv_c452e916-8621-4d4c-aee8-8bf9764fa860/node-exporter/0.log"
Apr 16 15:04:21.989511 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:21.989488 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-42bkv_c452e916-8621-4d4c-aee8-8bf9764fa860/kube-rbac-proxy/0.log"
Apr 16 15:04:22.018565 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.018544 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-42bkv_c452e916-8621-4d4c-aee8-8bf9764fa860/init-textfile/0.log"
Apr 16 15:04:22.202738 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.202665 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nzjwm_2bb6e164-2860-4f91-8060-da98bfd9c9be/kube-rbac-proxy-main/0.log"
Apr 16 15:04:22.227101 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.227079 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nzjwm_2bb6e164-2860-4f91-8060-da98bfd9c9be/kube-rbac-proxy-self/0.log"
Apr 16 15:04:22.251566 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.251546 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nzjwm_2bb6e164-2860-4f91-8060-da98bfd9c9be/openshift-state-metrics/0.log"
Apr 16 15:04:22.525619 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.525577 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-4bp2n_e8c815b4-e7d9-4b96-a516-7a00cc1a2578/prometheus-operator-admission-webhook/0.log"
Apr 16 15:04:22.566742 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.566713 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5cb45775c4-bn7dz_a7934c32-5099-418e-979d-b87db5205932/telemeter-client/0.log"
Apr 16 15:04:22.601043 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.601017 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5cb45775c4-bn7dz_a7934c32-5099-418e-979d-b87db5205932/reload/0.log"
Apr 16 15:04:22.628754 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.628729 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5cb45775c4-bn7dz_a7934c32-5099-418e-979d-b87db5205932/kube-rbac-proxy/0.log"
Apr 16 15:04:22.668852 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.668833 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/thanos-query/0.log"
Apr 16 15:04:22.701066 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.701019 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy-web/0.log"
Apr 16 15:04:22.727551 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.727535 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy/0.log"
Apr 16 15:04:22.760982 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.760964 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/prom-label-proxy/0.log"
Apr 16 15:04:22.793482 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.793436 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy-rules/0.log"
Apr 16 15:04:22.821887 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:22.821858 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7759bb9fbf-bph6s_0d55217e-e9ef-473d-9e26-0468e457f308/kube-rbac-proxy-metrics/0.log"
Apr 16 15:04:25.096237 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.096202 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg"]
Apr 16 15:04:25.096746 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.096707 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" containerName="console"
Apr 16 15:04:25.096746 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.096728 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" containerName="console"
Apr 16 15:04:25.096866 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.096822 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="e93e66dc-e1e5-4bc0-a5a4-207a0a77567c" containerName="console"
Apr 16 15:04:25.099915 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.099895 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg"
Apr 16 15:04:25.101889 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.101870 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g9h4j\"/\"openshift-service-ca.crt\""
Apr 16 15:04:25.101981 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.101927 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g9h4j\"/\"default-dockercfg-df925\""
Apr 16 15:04:25.102312 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.102299 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g9h4j\"/\"kube-root-ca.crt\""
Apr 16 15:04:25.105870 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.105764 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-proc\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg"
Apr 16 15:04:25.105870 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.105815 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-podres\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg"
Apr 16 15:04:25.105870 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.105841 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sghc6\" (UniqueName: \"kubernetes.io/projected/2b28422d-4d2f-44e3-ae49-62eab597e089-kube-api-access-sghc6\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: 
\"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.106178 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.105901 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-sys\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.106178 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.105942 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-lib-modules\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.107537 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.107517 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg"] Apr 16 15:04:25.206244 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206215 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-proc\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206244 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206248 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-podres\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " 
pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206431 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206264 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sghc6\" (UniqueName: \"kubernetes.io/projected/2b28422d-4d2f-44e3-ae49-62eab597e089-kube-api-access-sghc6\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206431 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206294 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-sys\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206431 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206326 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-proc\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206431 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206330 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-lib-modules\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206431 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206399 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-podres\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206431 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206405 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-lib-modules\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.206630 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.206427 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b28422d-4d2f-44e3-ae49-62eab597e089-sys\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.213888 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.213869 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sghc6\" (UniqueName: \"kubernetes.io/projected/2b28422d-4d2f-44e3-ae49-62eab597e089-kube-api-access-sghc6\") pod \"perf-node-gather-daemonset-dl4mg\" (UID: \"2b28422d-4d2f-44e3-ae49-62eab597e089\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.410461 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.410392 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:25.524730 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.524710 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg"] Apr 16 15:04:25.527350 ip-10-0-128-29 kubenswrapper[2582]: W0416 15:04:25.527324 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2b28422d_4d2f_44e3_ae49_62eab597e089.slice/crio-280011846e17ec8440d950701f78cfd4d36fe62e9195c8396cab586c0267436e WatchSource:0}: Error finding container 280011846e17ec8440d950701f78cfd4d36fe62e9195c8396cab586c0267436e: Status 404 returned error can't find the container with id 280011846e17ec8440d950701f78cfd4d36fe62e9195c8396cab586c0267436e Apr 16 15:04:25.528956 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.528936 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:04:25.823381 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.823356 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8cbxc_0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9/dns/0.log" Apr 16 15:04:25.844617 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.844587 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8cbxc_0bcafdf7-3f9d-4f9c-baf2-5a3c0edbf4b9/kube-rbac-proxy/0.log" Apr 16 15:04:25.958606 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:25.958576 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ztqms_1c601f7b-2758-4b61-a47e-bdc41ba6fb31/dns-node-resolver/0.log" Apr 16 15:04:26.200548 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:26.200453 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" 
event={"ID":"2b28422d-4d2f-44e3-ae49-62eab597e089","Type":"ContainerStarted","Data":"442ca51b33c9a5ac4ffe3b2c8cc9ac78f08de7cbc1ebced86bdc4e1e9c7e458a"} Apr 16 15:04:26.200548 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:26.200489 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" event={"ID":"2b28422d-4d2f-44e3-ae49-62eab597e089","Type":"ContainerStarted","Data":"280011846e17ec8440d950701f78cfd4d36fe62e9195c8396cab586c0267436e"} Apr 16 15:04:26.201031 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:26.200555 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:26.215585 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:26.215542 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" podStartSLOduration=1.215530795 podStartE2EDuration="1.215530795s" podCreationTimestamp="2026-04-16 15:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:04:26.213413319 +0000 UTC m=+3919.886314612" watchObservedRunningTime="2026-04-16 15:04:26.215530795 +0000 UTC m=+3919.888432089" Apr 16 15:04:26.417900 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:26.417871 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sqfgf_56fe5eb2-ae67-4d8a-a719-f51bf68da0d0/node-ca/0.log" Apr 16 15:04:27.657711 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:27.657674 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vf79m_41223147-714d-4ec2-a7b7-5febd776c247/serve-healthcheck-canary/0.log" Apr 16 15:04:28.197219 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:28.197193 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-gpnpx_02fa5e24-8818-4d97-9a44-c85c3daf42a9/kube-rbac-proxy/0.log" Apr 16 15:04:28.220496 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:28.220465 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gpnpx_02fa5e24-8818-4d97-9a44-c85c3daf42a9/exporter/0.log" Apr 16 15:04:28.256201 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:28.256172 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gpnpx_02fa5e24-8818-4d97-9a44-c85c3daf42a9/extractor/0.log" Apr 16 15:04:32.215504 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:32.215478 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-dl4mg" Apr 16 15:04:35.504159 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:35.504133 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-h82nj_f724d942-1eee-4167-a883-bbc5be00af26/kube-storage-version-migrator-operator/1.log" Apr 16 15:04:35.504988 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:35.504971 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-h82nj_f724d942-1eee-4167-a883-bbc5be00af26/kube-storage-version-migrator-operator/0.log" Apr 16 15:04:36.660452 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:36.660426 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pjcmk_dfd38cb4-73f0-4cb1-a3ee-4f877e37742f/kube-multus-additional-cni-plugins/0.log" Apr 16 15:04:36.685663 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:36.685640 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pjcmk_dfd38cb4-73f0-4cb1-a3ee-4f877e37742f/egress-router-binary-copy/0.log" Apr 16 15:04:36.712533 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:36.712511 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pjcmk_dfd38cb4-73f0-4cb1-a3ee-4f877e37742f/cni-plugins/0.log" Apr 16 15:04:36.737616 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:36.737599 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pjcmk_dfd38cb4-73f0-4cb1-a3ee-4f877e37742f/bond-cni-plugin/0.log" Apr 16 15:04:36.760642 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:36.760627 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pjcmk_dfd38cb4-73f0-4cb1-a3ee-4f877e37742f/routeoverride-cni/0.log" Apr 16 15:04:36.787910 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:36.787883 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pjcmk_dfd38cb4-73f0-4cb1-a3ee-4f877e37742f/whereabouts-cni-bincopy/0.log" Apr 16 15:04:36.812243 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:36.812225 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pjcmk_dfd38cb4-73f0-4cb1-a3ee-4f877e37742f/whereabouts-cni/0.log" Apr 16 15:04:37.010008 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:37.009978 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pq9ch_2b62492e-79c4-4431-a7af-4bcaa0f1c8aa/kube-multus/0.log" Apr 16 15:04:37.167037 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:37.167003 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rt77p_86d416f7-1028-4d19-9a65-2ecc6960eeb7/network-metrics-daemon/0.log" Apr 16 15:04:37.190192 ip-10-0-128-29 kubenswrapper[2582]: I0416 
15:04:37.190172 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rt77p_86d416f7-1028-4d19-9a65-2ecc6960eeb7/kube-rbac-proxy/0.log" Apr 16 15:04:38.239016 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.238981 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-controller/0.log" Apr 16 15:04:38.260398 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.260379 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/0.log" Apr 16 15:04:38.280563 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.280541 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovn-acl-logging/1.log" Apr 16 15:04:38.303618 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.303593 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/kube-rbac-proxy-node/0.log" Apr 16 15:04:38.327396 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.327375 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:04:38.354319 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.354301 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/northd/0.log" Apr 16 15:04:38.377237 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.377217 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/nbdb/0.log" Apr 16 15:04:38.406901 ip-10-0-128-29 kubenswrapper[2582]: I0416 
15:04:38.406880 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/sbdb/0.log" Apr 16 15:04:38.498494 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:38.498426 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-np5bh_ff8a1d02-93a9-4b2d-a7f6-95a4d86e0fe9/ovnkube-controller/0.log" Apr 16 15:04:39.913670 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:39.913643 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qpb5w_a0c4f1c8-43b6-4596-a619-0dd4cba798af/network-check-target-container/0.log" Apr 16 15:04:40.853644 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:40.853618 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-k697d_42370dc2-2d36-49c1-b178-6763d784a3e0/iptables-alerter/0.log" Apr 16 15:04:41.577354 ip-10-0-128-29 kubenswrapper[2582]: I0416 15:04:41.577320 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ff2qp_2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df/tuned/0.log"
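Each entry above is a journald line ("Mon DD HH:MM:SS.us host kubenswrapper[pid]:") wrapping a klog-formatted payload (severity letter + MMDD, time, thread id, source file:line, then the message). As an illustrative aside, not part of the log itself, a minimal sketch of splitting one such line into its parts; the field names and the regex are my own assumptions, tuned only to lines shaped like the entries above:

```python
import re

# Hypothetical parser for journald + klog lines like those in this log.
# Assumed layout: "<Mon DD HH:MM:SS.us> <host> kubenswrapper[<pid>]: "
# followed by klog's "<I|W|E|F><MMDD> <time> <threadid> <file:line>] <msg>".
LINE_RE = re.compile(
    r'^(?P<stamp>\w{3} \d{2} [\d:.]+) '
    r'(?P<host>\S+) kubenswrapper\[(?P<pid>\d+)\]: '
    r'(?P<sev>[IWEF])(?P<date>\d{4}) (?P<time>[\d:.]+)\s+\d+ '
    r'(?P<src>[\w.]+:\d+)\] (?P<msg>.*)$'
)

def parse(line: str):
    """Return a dict of the named fields for one line, or None if it doesn't match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

# The last entry of the log above, used as a sample input.
sample = ('Apr 16 15:04:41.577354 ip-10-0-128-29 kubenswrapper[2582]: '
          'I0416 15:04:41.577320 2582 log.go:25] "Finished parsing log file" '
          'path="/var/log/pods/openshift-cluster-node-tuning-operator_'
          'tuned-ff2qp_2ea59d91-2e6f-4f2d-b4dd-88ecb1bf00df/tuned/0.log"')
rec = parse(sample)
```

Filtering on `rec["sev"]` or `rec["src"]` is then enough to separate, say, the `log.go:25` "Finished parsing log file" noise from the `manager.go:1169` warning.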