Apr 22 18:36:44.162267 ip-10-0-129-142 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:36:44.645368 ip-10-0-129-142 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:44.645368 ip-10-0-129-142 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:36:44.645368 ip-10-0-129-142 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:44.645368 ip-10-0-129-142 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:36:44.645368 ip-10-0-129-142 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:44.648908 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.648812 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:36:44.655290 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655260 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:44.655290 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655285 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:44.655290 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655288 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:44.655290 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655291 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:44.655290 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655295 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:44.655290 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655299 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:44.655290 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655303 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655306 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655309 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655312 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655315 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655318 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655320 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655323 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655325 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655329 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655333 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655336 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655339 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655342 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655345 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655347 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655350 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655353 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655355 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655358 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:44.655560 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655360 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655363 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655365 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655369 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655371 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655374 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655377 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655380 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655382 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655385 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655387 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655391 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655394 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655396 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655399 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655401 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655404 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655406 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655409 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655413 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:44.656100 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655416 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655419 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655421 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655424 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655426 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655429 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655432 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655434 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655436 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655439 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655442 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655444 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655447 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655450 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655452 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655455 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655458 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655460 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655463 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655466 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:44.656607 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655469 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655471 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655474 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655477 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655479 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655482 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655485 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655490 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655494 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655497 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655500 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655503 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655511 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655514 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655517 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655519 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655521 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655524 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655527 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:44.657104 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655529 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655986 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655992 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655995 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.655998 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656000 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656003 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656006 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656009 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656013 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656017 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656020 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656023 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656026 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656028 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656031 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656034 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656037 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656040 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:44.657568 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656042 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656045 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656048 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656050 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656053 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656055 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656059 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656061 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656064 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656066 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656069 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656071 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656073 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656076 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656078 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656081 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656083 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656086 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656088 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656091 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:44.658040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656093 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656095 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656098 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656100 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656104 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656107 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656109 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656112 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656114 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656117 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656119 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656122 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656124 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656129 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656132 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656135 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656138 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656141 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656144 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656147 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:44.658527 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656150 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656152 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656156 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656158 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656161 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656163 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656166 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656168 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656171 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656173 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656176 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656178 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656181 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656183 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656186 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656188 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656192 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656195 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656197 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656200 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:44.659116 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656202 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656205 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656207 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656210 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656212 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656215 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656217 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656220 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656298 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656306 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656313 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656317 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656325 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656329 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656334 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656338 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656342 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656345 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656348 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656352 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656355 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656358 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:36:44.659608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656361 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656363 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656366 2575 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656369 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656372 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656377 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656381 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656384 2575 flags.go:64] FLAG: --config-dir=""
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656387 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656390 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656394 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656397 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656400 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656404 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656407 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656409 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656412 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656416 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656418 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656423 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656426 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656429 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656431 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656435 2575 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656438 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:36:44.660173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656442 2575 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656445 2575 flags.go:64] FLAG: --event-qps="50"
Apr 22
18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656448 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656452 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656454 2575 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656463 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656466 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656469 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656472 2575 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656475 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656478 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656481 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656484 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656487 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656490 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656493 2575 flags.go:64] FLAG: --feature-gates="" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656497 
2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656500 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656503 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656506 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656508 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656511 2575 flags.go:64] FLAG: --help="false" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656514 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-129-142.ec2.internal" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656517 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:36:44.660789 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656521 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656523 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656527 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656530 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656533 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656535 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:36:44.661435 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656538 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656541 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656544 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656547 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656550 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656553 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656556 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656560 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656563 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656565 2575 flags.go:64] FLAG: --lock-file="" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656568 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656571 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656574 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656579 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656582 2575 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656586 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656589 2575 flags.go:64] FLAG: --logging-format="text" Apr 22 18:36:44.661435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656592 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656595 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656598 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656617 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656623 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656627 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656631 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656635 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656638 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656641 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656643 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656646 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:36:44.662017 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656649 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656652 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656660 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656663 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656666 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656670 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656673 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656679 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656682 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656685 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656688 2575 flags.go:64] FLAG: --port="10250" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656692 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:36:44.662017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656695 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0884a2fd3b8636cad" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656698 2575 flags.go:64] FLAG: --qos-reserved="" Apr 
22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656701 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656704 2575 flags.go:64] FLAG: --register-node="true" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656707 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656710 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656715 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656718 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656720 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656723 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656727 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656730 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656733 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656736 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656738 2575 flags.go:64] FLAG: --runonce="false" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656741 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656744 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656747 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656749 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656752 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656756 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656759 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656762 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656765 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656768 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656770 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:36:44.662592 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656774 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656777 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656779 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656782 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656788 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:36:44.663274 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:36:44.656792 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656794 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656799 2575 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656802 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656804 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656807 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656810 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656815 2575 flags.go:64] FLAG: --v="2" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656819 2575 flags.go:64] FLAG: --version="false" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656823 2575 flags.go:64] FLAG: --vmodule="" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656827 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.656831 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656940 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656945 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656949 2575 feature_gate.go:328] unrecognized feature 
gate: DNSNameResolver Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656952 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656955 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656958 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656960 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:44.663274 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656963 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656966 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656968 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656971 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656973 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656976 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656978 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656981 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656984 2575 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656987 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656990 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656995 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.656999 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657003 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657007 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657010 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657013 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657016 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657018 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:44.663877 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657021 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657025 2575 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657027 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657030 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657033 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657035 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657038 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657040 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657043 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657045 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657048 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657050 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657053 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657055 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 
18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657058 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657061 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657063 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657066 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657068 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657071 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:44.664371 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657073 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657076 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657078 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657081 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657083 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657086 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 
18:36:44.657089 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657092 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657094 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657097 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657100 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657102 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657105 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657109 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657112 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657114 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657117 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657119 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657122 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:44.664867 
ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657125 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:44.664867 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657127 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657130 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657132 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657134 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657137 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657139 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657142 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657144 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657147 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657149 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657151 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657154 2575 feature_gate.go:328] unrecognized 
feature gate: MachineAPIMigration
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657156 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657159 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657161 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657163 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657168 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657170 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657174 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:44.665375 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.657176 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.658045 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.665020 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.665040 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665091 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665097 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665100 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665104 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665107 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665109 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665112 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665115 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665118 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665121 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665124 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:44.665842 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665126 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665129 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665132 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665134 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665137 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665139 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665142 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665144 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665147 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665149 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665152 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665154 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665157 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665159 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665162 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665165 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665168 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665170 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665173 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665175 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:44.666262 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665179 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665181 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665184 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665187 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665190 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665192 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665195 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665197 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665199 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665202 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665204 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665207 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665209 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665212 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665214 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665217 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665219 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665222 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665224 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665227 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:44.666750 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665230 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665232 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665235 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665237 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665240 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665242 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665245 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665248 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665250 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665253 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665255 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665258 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665261 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665264 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665267 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665270 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665272 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665275 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665277 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:44.667288 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665281 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665285 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665288 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665291 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665293 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665296 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665298 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665301 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665305 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665308 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665311 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665314 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665317 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665320 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665322 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:44.667747 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665325 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.665331 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665438 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665443 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665446 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665448 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665452 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665454 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665457 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665459 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665462 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665465 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665467 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665470 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665472 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:44.668123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665475 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665477 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665480 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665483 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665485 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665487 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665490 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665492 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665495 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665497 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665499 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665502 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665504 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665507 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665509 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665511 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665514 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665516 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665519 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665532 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:44.668485 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665536 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665538 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665541 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665544 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665547 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665549 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665552 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665555 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665557 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665560 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665562 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665565 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665567 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665570 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665572 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665575 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665577 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665580 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665582 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665584 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:44.669039 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665587 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665589 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665592 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665594 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665597 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665599 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665601 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665604 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665606 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665609 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665611 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665613 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665616 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665619 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665621 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665624 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665626 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665630 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665633 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665636 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:44.669530 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665638 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665642 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665646 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665648 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665651 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665654 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665657 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665659 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665662 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665665 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665668 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665670 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:44.665674 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.665679 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:44.670040 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.666462 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:36:44.670383 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.668497 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:36:44.670383 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.669497 2575 server.go:1019] "Starting client certificate rotation"
Apr 22 18:36:44.670383 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.669607 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:36:44.670383 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.669638 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:36:44.696642 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.696617 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:36:44.700019 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.699989 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:36:44.713319 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.713288 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:36:44.720431 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.720391 2575 log.go:25] "Validated CRI v1 image API"
Apr 22 18:36:44.721754 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.721730 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:36:44.727573 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.727550 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9691633f-b7f2-47dd-9798-ca83bd3e293b:/dev/nvme0n1p4 9a0399a0-df3a-46cc-9b6f-439cb9d914ca:/dev/nvme0n1p3]
Apr 22 18:36:44.727671 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.727573 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:36:44.727808 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.727792 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:36:44.734468 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.734337 2575 manager.go:217] Machine: {Timestamp:2026-04-22 18:36:44.731331458 +0000 UTC m=+0.438553877 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099153 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2145eb1adb36e2e42514eccea298b8 SystemUUID:ec2145eb-1adb-36e2-e425-14eccea298b8 BootID:56b49e86-e57c-4e8b-9ee8-72cc7a61571a Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:75:38:30:3b:d5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:75:38:30:3b:d5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:bf:2f:83:ee:d8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:36:44.734468 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.734458 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:36:44.734590 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.734552 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:36:44.735919 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.735887 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:36:44.736095 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.735944 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-142.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinRec
laim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:36:44.736137 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.736112 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:36:44.736137 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.736121 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:36:44.736194 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.736138 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:44.736950 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.736937 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:44.738034 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.738023 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:44.738148 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.738139 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:36:44.740751 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.740737 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:36:44.740795 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.740755 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:36:44.740795 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.740770 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 
18:36:44.740795 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.740780 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:36:44.740795 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.740789 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:36:44.742004 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.741991 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:44.742039 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.742012 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:44.746233 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.746216 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:36:44.747772 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.747755 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:36:44.748235 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.748214 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c4tsb" Apr 22 18:36:44.749572 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749556 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749594 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749601 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749607 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/host-path" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749613 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749618 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749624 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749629 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749636 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:36:44.749645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749643 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:36:44.749868 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749656 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:36:44.749868 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.749665 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:36:44.750641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.750630 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:36:44.750720 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.750644 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:36:44.753776 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.753736 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:36:44.753941 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.753733 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:36:44.754253 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.754227 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:36:44.754662 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.754634 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:36:44.754753 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.754697 2575 server.go:1295] "Started kubelet" Apr 22 18:36:44.754968 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.754914 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:36:44.755023 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.754945 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:36:44.755080 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.755025 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:36:44.755727 ip-10-0-129-142 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:36:44.756429 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.756413 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:36:44.757499 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.757479 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c4tsb" Apr 22 18:36:44.758508 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.758484 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:36:44.764286 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.764262 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:36:44.764286 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.764278 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:44.764891 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.764873 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:36:44.764891 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.764875 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:36:44.765036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.764905 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:36:44.765036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.764978 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:36:44.765036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.764987 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:36:44.765378 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.765361 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:44.767642 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.767625 2575 factory.go:55] Registering systemd factory Apr 22 
18:36:44.767720 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.767675 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:36:44.767998 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.767981 2575 factory.go:153] Registering CRI-O factory Apr 22 18:36:44.768112 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.768094 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 18:36:44.768200 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.768158 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:36:44.768200 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.768182 2575 factory.go:103] Registering Raw factory Apr 22 18:36:44.768200 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.768197 2575 manager.go:1196] Started watching for new ooms in manager Apr 22 18:36:44.768452 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.768030 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:36:44.768817 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.768801 2575 manager.go:319] Starting recovery of all containers Apr 22 18:36:44.772949 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.772909 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:44.774981 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.774943 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 18:36:44.778337 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.778195 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-142.ec2.internal\" not found" node="ip-10-0-129-142.ec2.internal" Apr 22 18:36:44.778456 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.778216 2575 manager.go:324] Recovery completed Apr 22 18:36:44.783872 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.783856 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:44.786347 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.786333 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:44.786417 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.786365 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:44.786417 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.786376 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:44.786871 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.786858 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:36:44.786914 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.786871 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:36:44.786914 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.786891 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:44.789568 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.789555 2575 policy_none.go:49] "None policy: Start" Apr 22 18:36:44.789617 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.789572 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:36:44.789617 ip-10-0-129-142 kubenswrapper[2575]: 
I0422 18:36:44.789582 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:36:44.829631 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.829609 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.829648 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.829658 2575 server.go:85] "Starting device plugin registration server" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.829952 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.829964 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.830061 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.830147 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.830155 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.830910 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 18:36:44.838174 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.830975 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:44.928800 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.928766 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:36:44.928800 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.928803 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:36:44.929013 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.928826 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:36:44.929013 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.928832 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:36:44.929013 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.928887 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:36:44.930114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.930094 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:44.930884 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.930862 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:44.931005 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.930898 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:44.931005 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.930911 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:44.931005 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.930957 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-142.ec2.internal" Apr 22 18:36:44.931967 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.931951 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:44.940269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:44.940250 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-142.ec2.internal" Apr 22 18:36:44.940367 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.940275 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-142.ec2.internal\": node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:44.953580 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:44.953547 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.029757 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.029700 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal"] Apr 22 18:36:45.029959 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.029794 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:45.031023 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.031003 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:45.031124 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.031038 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:45.031124 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:36:45.031047 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:45.032415 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.032402 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:45.032596 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.032582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.032646 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.032612 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:45.033748 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.033717 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:45.033748 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.033749 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:45.033898 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.033759 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:45.033898 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.033718 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:45.033898 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.033827 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:45.033898 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.033838 2575 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:45.035131 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.035118 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.035174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.035144 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:45.035902 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.035885 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:45.036006 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.035911 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:45.036006 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.035936 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:45.053823 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.053783 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.061893 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.061860 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-142.ec2.internal\" not found" node="ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.066329 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.066309 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-142.ec2.internal\" not found" node="ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.067361 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.067343 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c3f3197044afe14bb61b56250727776-config\") pod \"kube-apiserver-proxy-ip-10-0-129-142.ec2.internal\" (UID: \"8c3f3197044afe14bb61b56250727776\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.067412 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.067370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/83e556b964a4cb65f16be5aa8fc1307d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal\" (UID: \"83e556b964a4cb65f16be5aa8fc1307d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.067412 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.067391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83e556b964a4cb65f16be5aa8fc1307d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal\" (UID: \"83e556b964a4cb65f16be5aa8fc1307d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.154662 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.154627 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.167733 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.167702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/83e556b964a4cb65f16be5aa8fc1307d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal\" (UID: \"83e556b964a4cb65f16be5aa8fc1307d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 
18:36:45.167792 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.167739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/83e556b964a4cb65f16be5aa8fc1307d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal\" (UID: \"83e556b964a4cb65f16be5aa8fc1307d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.167792 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.167755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c3f3197044afe14bb61b56250727776-config\") pod \"kube-apiserver-proxy-ip-10-0-129-142.ec2.internal\" (UID: \"8c3f3197044afe14bb61b56250727776\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.167854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.167807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8c3f3197044afe14bb61b56250727776-config\") pod \"kube-apiserver-proxy-ip-10-0-129-142.ec2.internal\" (UID: \"8c3f3197044afe14bb61b56250727776\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.167854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.167806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/83e556b964a4cb65f16be5aa8fc1307d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal\" (UID: \"83e556b964a4cb65f16be5aa8fc1307d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.167912 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.167809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/83e556b964a4cb65f16be5aa8fc1307d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal\" (UID: \"83e556b964a4cb65f16be5aa8fc1307d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.255706 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.255633 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.356247 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.356216 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.364429 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.364408 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.369094 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.369064 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:45.456975 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.456909 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.557415 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.557351 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.657869 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.657836 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.669154 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.669127 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:36:45.669342 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.669320 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:36:45.669394 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.669320 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:36:45.758655 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.758621 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.759783 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.759716 2575 
certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:31:44 +0000 UTC" deadline="2027-11-11 03:48:08.963342203 +0000 UTC" Apr 22 18:36:45.759783 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.759756 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13617h11m23.20359194s" Apr 22 18:36:45.764475 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.764452 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:45.777111 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.777089 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:36:45.784269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.784245 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:45.822147 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.822125 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-f48np" Apr 22 18:36:45.831141 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:45.831106 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c3f3197044afe14bb61b56250727776.slice/crio-4050d7ae6930d68ec213a7ba7bd4f304d46a9aff147feece1284326ac70896bd WatchSource:0}: Error finding container 4050d7ae6930d68ec213a7ba7bd4f304d46a9aff147feece1284326ac70896bd: Status 404 returned error can't find the container with id 4050d7ae6930d68ec213a7ba7bd4f304d46a9aff147feece1284326ac70896bd Apr 22 18:36:45.835421 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.835404 2575 provider.go:93] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:36:45.835581 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.835563 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-f48np" Apr 22 18:36:45.859631 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.859600 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:45.932815 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:45.932754 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" event={"ID":"8c3f3197044afe14bb61b56250727776","Type":"ContainerStarted","Data":"4050d7ae6930d68ec213a7ba7bd4f304d46a9aff147feece1284326ac70896bd"} Apr 22 18:36:45.935602 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:45.935570 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83e556b964a4cb65f16be5aa8fc1307d.slice/crio-f1f6005dcac735b3be56a4ec07ec3d91d4432fec205965c46978f1870b519ae1 WatchSource:0}: Error finding container f1f6005dcac735b3be56a4ec07ec3d91d4432fec205965c46978f1870b519ae1: Status 404 returned error can't find the container with id f1f6005dcac735b3be56a4ec07ec3d91d4432fec205965c46978f1870b519ae1 Apr 22 18:36:45.960098 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:45.960066 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:46.060565 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.060544 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:46.161145 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.161111 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:46.261956 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.261906 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-142.ec2.internal\" not found" Apr 22 18:36:46.327377 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.327344 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:46.365525 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.365289 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" Apr 22 18:36:46.382445 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.382408 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:36:46.386137 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.386107 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" Apr 22 18:36:46.396083 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.396051 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:36:46.597859 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.597765 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:46.742155 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.742112 2575 apiserver.go:52] "Watching apiserver" Apr 22 18:36:46.749939 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.749900 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:36:46.751236 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:36:46.751211 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-f4ls2","openshift-ovn-kubernetes/ovnkube-node-zbqgd","kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs","openshift-dns/node-resolver-vnqvq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal","openshift-multus/multus-additional-cni-plugins-krfvd","openshift-multus/multus-qjb4s","openshift-multus/network-metrics-daemon-sx6lq","openshift-network-diagnostics/network-check-target-xg9qw","kube-system/konnectivity-agent-tdjfh","openshift-cluster-node-tuning-operator/tuned-vgxcc","openshift-image-registry/node-ca-smstv"] Apr 22 18:36:46.753332 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.753312 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.754587 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.754565 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.755838 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.755815 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.756232 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.756207 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:36:46.756334 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.756240 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:36:46.756334 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.756259 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mgxxx\"" Apr 22 18:36:46.756334 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.756305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:36:46.756993 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.756973 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:36:46.757109 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.757030 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:36:46.757109 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.757040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-m6fn9\"" Apr 22 18:36:46.757299 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.757281 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:36:46.757359 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.757288 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.759054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.758570 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:36:46.759054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.758643 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.759054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.758710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lqdrk\"" Apr 22 18:36:46.759054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.758826 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:36:46.760262 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.759785 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.760262 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.759964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:36:46.760262 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.760051 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:36:46.760262 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.760102 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dht7c\"" Apr 22 18:36:46.761912 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.761034 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:36:46.761912 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.761163 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:36:46.761912 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.761180 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:36:46.762378 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.762346 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.763054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.762705 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:36:46.763054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.762874 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:36:46.763054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.762892 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:36:46.763054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.763015 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:36:46.763299 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.763061 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6fjr7\"" Apr 22 18:36:46.763299 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.763110 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-44zj9\"" Apr 22 18:36:46.763299 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.763226 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:36:46.764157 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.764142 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:36:46.764532 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.764501 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:36:46.765348 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.765325 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:36:46.765444 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.765380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6r4sp\"" Apr 22 18:36:46.765852 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.765835 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:36:46.766057 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.766033 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:36:46.767885 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.767866 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:46.769310 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.769288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.771432 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.771409 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:36:46.771734 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.771718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:36:46.772083 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.771945 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fprjx\"" Apr 22 18:36:46.772625 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.772608 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:36:46.772872 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.772854 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wcfs2\"" Apr 22 18:36:46.774566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.774546 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:36:46.774669 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.774600 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:36:46.774669 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.774610 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 
18:36:46.775943 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.775906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-registration-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.776027 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.775954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.776027 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.775979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysconfig\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776027 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-systemd\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-lib-modules\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-multus-certs\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.776173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74czq\" (UniqueName: \"kubernetes.io/projected/828d62db-71ec-4b5b-b36f-70124c98ab1b-kube-api-access-74czq\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.776173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776110 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:36:46.776173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776120 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:36:46.776173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-modprobe-d\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776178 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-run\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb9d517f-20af-4295-b872-990841be4bb8-iptables-alerter-script\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-kubelet\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-hosts-file\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-sys-fs\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-kubernetes\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-system-cni-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-cni-multus\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-sys\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-var-lib-kubelet\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776401 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-cni-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.776451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-etc-kubernetes\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776479 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r267j\" (UniqueName: \"kubernetes.io/projected/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-kube-api-access-r267j\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b064dca-dd29-4d80-aaba-ffe324b856eb-serviceca\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd94de2c-a623-493f-b731-6e8e75d73ccb-tmp\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cni-binary-copy\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjjq2\" (UniqueName: \"kubernetes.io/projected/fb9d517f-20af-4295-b872-990841be4bb8-kube-api-access-jjjq2\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " 
pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb9d517f-20af-4295-b872-990841be4bb8-host-slash\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/828d62db-71ec-4b5b-b36f-70124c98ab1b-cni-binary-copy\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-k8s-cni-cncf-io\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-tuned\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-cnibin\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-os-release\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-conf-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysctl-d\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-socket-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.777097 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysctl-conf\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-socket-dir-parent\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-netns\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.776963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-cni-bin\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:36:46.776988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-device-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b064dca-dd29-4d80-aaba-ffe324b856eb-host\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-host\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-system-cni-dir\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzvhd\" (UniqueName: \"kubernetes.io/projected/90bf1722-06bc-4b40-aeb1-0f3d321b412b-kube-api-access-jzvhd\") pod \"multus-additional-cni-plugins-krfvd\" (UID: 
\"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-hostroot\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-daemon-config\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-os-release\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-rf6xz\" (UniqueName: \"kubernetes.io/projected/261a5574-d8e0-4ff1-99d3-ac73ec229f92-kube-api-access-rf6xz\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-tmp-dir\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.777877 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cnibin\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.778631 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fttrg\" (UniqueName: \"kubernetes.io/projected/bd94de2c-a623-493f-b731-6e8e75d73ccb-kube-api-access-fttrg\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.778631 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bx9l\" (UniqueName: \"kubernetes.io/projected/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-kube-api-access-9bx9l\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.778631 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777278 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdf6h\" (UniqueName: \"kubernetes.io/projected/2b064dca-dd29-4d80-aaba-ffe324b856eb-kube-api-access-sdf6h\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.778631 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.777309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.784033 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.784003 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:46.836352 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.836313 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:45 +0000 UTC" deadline="2028-01-11 18:25:21.816420184 +0000 UTC" Apr 22 18:36:46.836352 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.836344 2575 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15095h48m34.980079049s" Apr 22 18:36:46.865538 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.865450 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:36:46.878273 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysctl-conf\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.878448 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-socket-dir-parent\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878448 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-socket-dir-parent\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878448 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-netns\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878448 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878437 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysctl-conf\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.878448 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-cni-bin\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-etc-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-ovn\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-netns\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878516 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-device-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b064dca-dd29-4d80-aaba-ffe324b856eb-host\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-device-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-host\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-system-cni-dir\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:36:46.878610 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-cni-bin\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvhd\" (UniqueName: \"kubernetes.io/projected/90bf1722-06bc-4b40-aeb1-0f3d321b412b-kube-api-access-jzvhd\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b064dca-dd29-4d80-aaba-ffe324b856eb-host\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-hostroot\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-host\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878682 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-daemon-config\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.878697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878688 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-system-cni-dir\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-os-release\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-hostroot\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.879406 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:36:46.878757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6xz\" (UniqueName: \"kubernetes.io/projected/261a5574-d8e0-4ff1-99d3-ac73ec229f92-kube-api-access-rf6xz\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-os-release\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-node-log\") pod \"ovnkube-node-zbqgd\" (UID: 
\"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-tmp-dir\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-systemd-units\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.878991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cnibin\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovn-node-metrics-cert\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879108 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88s5t\" (UniqueName: 
\"kubernetes.io/projected/08a25179-34e4-4328-9e3a-dfcc4b1aa177-kube-api-access-88s5t\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fttrg\" (UniqueName: \"kubernetes.io/projected/bd94de2c-a623-493f-b731-6e8e75d73ccb-kube-api-access-fttrg\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-run-ovn-kubernetes\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-cni-netd\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.879406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e513b58f-d876-4d46-a313-2450da792164-agent-certs\") pod \"konnectivity-agent-tdjfh\" (UID: \"e513b58f-d876-4d46-a313-2450da792164\") " pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879287 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx9l\" (UniqueName: \"kubernetes.io/projected/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-kube-api-access-9bx9l\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdf6h\" (UniqueName: \"kubernetes.io/projected/2b064dca-dd29-4d80-aaba-ffe324b856eb-kube-api-access-sdf6h\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-cni-bin\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-registration-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysconfig\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-daemon-config\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-systemd\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-lib-modules\") pod \"tuned-vgxcc\" (UID: 
\"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-multus-certs\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74czq\" (UniqueName: \"kubernetes.io/projected/828d62db-71ec-4b5b-b36f-70124c98ab1b-kube-api-access-74czq\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879640 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879671 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-log-socket\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-modprobe-d\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-run\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.880184 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb9d517f-20af-4295-b872-990841be4bb8-iptables-alerter-script\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-kubelet\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-tmp-dir\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-hosts-file\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.879900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cnibin\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-sys-fs\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-kubelet\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880291 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-hosts-file\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-kubernetes\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-system-cni-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-sys-fs\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-cni-multus\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880442 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovnkube-config\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e513b58f-d876-4d46-a313-2450da792164-konnectivity-ca\") pod \"konnectivity-agent-tdjfh\" (UID: \"e513b58f-d876-4d46-a313-2450da792164\") " pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-sys\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-var-lib-kubelet\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881226 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.881995 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:36:46.880604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-cni-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-etc-kubernetes\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb9d517f-20af-4295-b872-990841be4bb8-iptables-alerter-script\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-lib-modules\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881995 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:36:46.880853 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysconfig\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-sys\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880963 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-registration-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-var-lib-cni-multus\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.880985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-multus-certs\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: 
I0422 18:36:46.881072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-run\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-etc-kubernetes\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-modprobe-d\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r267j\" (UniqueName: \"kubernetes.io/projected/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-kube-api-access-r267j\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881265 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-var-lib-kubelet\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90bf1722-06bc-4b40-aeb1-0f3d321b412b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.881995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-system-cni-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-systemd\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b064dca-dd29-4d80-aaba-ffe324b856eb-serviceca\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881377 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-cni-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-kubernetes\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd94de2c-a623-493f-b731-6e8e75d73ccb-tmp\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.881346 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cni-binary-copy\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.881612 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs podName:261a5574-d8e0-4ff1-99d3-ac73ec229f92 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:47.381572583 +0000 UTC m=+3.088795012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs") pod "network-metrics-daemon-sx6lq" (UID: "261a5574-d8e0-4ff1-99d3-ac73ec229f92") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-run-netns\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjjq2\" (UniqueName: \"kubernetes.io/projected/fb9d517f-20af-4295-b872-990841be4bb8-kube-api-access-jjjq2\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881846 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-var-lib-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-env-overrides\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.881980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovnkube-script-lib\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb9d517f-20af-4295-b872-990841be4bb8-host-slash\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.882742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/828d62db-71ec-4b5b-b36f-70124c98ab1b-cni-binary-copy\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb9d517f-20af-4295-b872-990841be4bb8-host-slash\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90bf1722-06bc-4b40-aeb1-0f3d321b412b-cni-binary-copy\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-k8s-cni-cncf-io\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882126 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-host-run-k8s-cni-cncf-io\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-kubelet\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-tuned\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-cnibin\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882243 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-os-release\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-conf-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-cnibin\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-os-release\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-slash\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/828d62db-71ec-4b5b-b36f-70124c98ab1b-multus-conf-dir\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-systemd\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysctl-d\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.883501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-socket-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.884055 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/2b064dca-dd29-4d80-aaba-ffe324b856eb-serviceca\") pod \"node-ca-smstv\" (UID: \"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.884055 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.882672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-socket-dir\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.884055 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.883007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-sysctl-d\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.884209 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.884185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/828d62db-71ec-4b5b-b36f-70124c98ab1b-cni-binary-copy\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.885723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd94de2c-a623-493f-b731-6e8e75d73ccb-etc-tuned\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.886865 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.886886 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.886904 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p6ms7 for pod openshift-network-diagnostics/network-check-target-xg9qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.887006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd94de2c-a623-493f-b731-6e8e75d73ccb-tmp\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:46.887113 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7 podName:5664213f-7ddf-424a-af58-77eee7d05998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.38709197 +0000 UTC m=+3.094314390 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p6ms7" (UniqueName: "kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7") pod "network-check-target-xg9qw" (UID: "5664213f-7ddf-424a-af58-77eee7d05998") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.889202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvhd\" (UniqueName: \"kubernetes.io/projected/90bf1722-06bc-4b40-aeb1-0f3d321b412b-kube-api-access-jzvhd\") pod \"multus-additional-cni-plugins-krfvd\" (UID: \"90bf1722-06bc-4b40-aeb1-0f3d321b412b\") " pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.890284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fttrg\" (UniqueName: \"kubernetes.io/projected/bd94de2c-a623-493f-b731-6e8e75d73ccb-kube-api-access-fttrg\") pod \"tuned-vgxcc\" (UID: \"bd94de2c-a623-493f-b731-6e8e75d73ccb\") " pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.890450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6xz\" (UniqueName: \"kubernetes.io/projected/261a5574-d8e0-4ff1-99d3-ac73ec229f92-kube-api-access-rf6xz\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.890698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdf6h\" (UniqueName: \"kubernetes.io/projected/2b064dca-dd29-4d80-aaba-ffe324b856eb-kube-api-access-sdf6h\") pod \"node-ca-smstv\" (UID: 
\"2b064dca-dd29-4d80-aaba-ffe324b856eb\") " pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.891883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r267j\" (UniqueName: \"kubernetes.io/projected/6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17-kube-api-access-r267j\") pod \"node-resolver-vnqvq\" (UID: \"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17\") " pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:46.892953 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.892149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74czq\" (UniqueName: \"kubernetes.io/projected/828d62db-71ec-4b5b-b36f-70124c98ab1b-kube-api-access-74czq\") pod \"multus-qjb4s\" (UID: \"828d62db-71ec-4b5b-b36f-70124c98ab1b\") " pod="openshift-multus/multus-qjb4s" Apr 22 18:36:46.893584 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.892988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjjq2\" (UniqueName: \"kubernetes.io/projected/fb9d517f-20af-4295-b872-990841be4bb8-kube-api-access-jjjq2\") pod \"iptables-alerter-f4ls2\" (UID: \"fb9d517f-20af-4295-b872-990841be4bb8\") " pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:46.894588 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.894564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bx9l\" (UniqueName: \"kubernetes.io/projected/f3e9b483-1e32-4a5e-b0d0-200cf509c4e6-kube-api-access-9bx9l\") pod \"aws-ebs-csi-driver-node-cvxjs\" (UID: \"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:46.935562 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.935526 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" 
event={"ID":"83e556b964a4cb65f16be5aa8fc1307d","Type":"ContainerStarted","Data":"f1f6005dcac735b3be56a4ec07ec3d91d4432fec205965c46978f1870b519ae1"} Apr 22 18:36:46.983169 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-etc-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983347 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-ovn\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983347 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-etc-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983347 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-ovn\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983459 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-openvswitch\") 
pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983459 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-node-log\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983459 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-systemd-units\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovn-node-metrics-cert\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88s5t\" (UniqueName: \"kubernetes.io/projected/08a25179-34e4-4328-9e3a-dfcc4b1aa177-kube-api-access-88s5t\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-node-log\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-systemd-units\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-run-ovn-kubernetes\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-cni-netd\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e513b58f-d876-4d46-a313-2450da792164-agent-certs\") pod \"konnectivity-agent-tdjfh\" (UID: \"e513b58f-d876-4d46-a313-2450da792164\") " pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:46.983649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-run-ovn-kubernetes\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-cni-bin\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-cni-netd\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-log-socket\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-cni-bin\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-log-socket\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovnkube-config\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e513b58f-d876-4d46-a313-2450da792164-konnectivity-ca\") pod \"konnectivity-agent-tdjfh\" (UID: \"e513b58f-d876-4d46-a313-2450da792164\") " pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-run-netns\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-var-lib-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-env-overrides\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovnkube-script-lib\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:36:46.983959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-run-netns\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-var-lib-openvswitch\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.983980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-kubelet\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-slash\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-systemd\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:36:46.984132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-slash\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-run-systemd\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08a25179-34e4-4328-9e3a-dfcc4b1aa177-host-kubelet\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovnkube-config\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-env-overrides\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984396 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e513b58f-d876-4d46-a313-2450da792164-konnectivity-ca\") pod \"konnectivity-agent-tdjfh\" (UID: \"e513b58f-d876-4d46-a313-2450da792164\") " pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:46.984778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.984571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovnkube-script-lib\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.986438 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.986367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08a25179-34e4-4328-9e3a-dfcc4b1aa177-ovn-node-metrics-cert\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:46.986631 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.986603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e513b58f-d876-4d46-a313-2450da792164-agent-certs\") pod \"konnectivity-agent-tdjfh\" (UID: \"e513b58f-d876-4d46-a313-2450da792164\") " pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:46.993834 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:46.993808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88s5t\" (UniqueName: \"kubernetes.io/projected/08a25179-34e4-4328-9e3a-dfcc4b1aa177-kube-api-access-88s5t\") pod \"ovnkube-node-zbqgd\" (UID: \"08a25179-34e4-4328-9e3a-dfcc4b1aa177\") " pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:47.066080 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.066037 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" Apr 22 18:36:47.075869 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.075841 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-krfvd" Apr 22 18:36:47.087167 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.087145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-smstv" Apr 22 18:36:47.092817 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.092790 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" Apr 22 18:36:47.098485 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.098462 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vnqvq" Apr 22 18:36:47.105125 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.105100 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f4ls2" Apr 22 18:36:47.114824 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.114795 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qjb4s" Apr 22 18:36:47.122641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.122556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:36:47.127391 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.127359 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:36:47.387663 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.387576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:36:47.387663 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.387638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:36:47.387882 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:47.387762 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:47.387882 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:47.387800 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:47.387882 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:47.387820 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:47.387882 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:47.387832 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p6ms7 for pod openshift-network-diagnostics/network-check-target-xg9qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:47.387882 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:47.387881 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs podName:261a5574-d8e0-4ff1-99d3-ac73ec229f92 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:48.387861312 +0000 UTC m=+4.095083721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs") pod "network-metrics-daemon-sx6lq" (UID: "261a5574-d8e0-4ff1-99d3-ac73ec229f92") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:47.388111 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:47.387905 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7 podName:5664213f-7ddf-424a-af58-77eee7d05998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:48.387892137 +0000 UTC m=+4.095114548 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p6ms7" (UniqueName: "kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7") pod "network-check-target-xg9qw" (UID: "5664213f-7ddf-424a-af58-77eee7d05998") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:47.568331 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:47.568301 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode513b58f_d876_4d46_a313_2450da792164.slice/crio-2782b771458feb2bb7089ece4f8d52a1a871cc90c2afad7e1c6b11533bcf5946 WatchSource:0}: Error finding container 2782b771458feb2bb7089ece4f8d52a1a871cc90c2afad7e1c6b11533bcf5946: Status 404 returned error can't find the container with id 2782b771458feb2bb7089ece4f8d52a1a871cc90c2afad7e1c6b11533bcf5946 Apr 22 18:36:47.569324 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:47.569298 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd94de2c_a623_493f_b731_6e8e75d73ccb.slice/crio-d0b93a8f6f484cf3b4e4d6298bd437ebfa39e31d0dfa62447e59e52aa600f4dc WatchSource:0}: Error finding container d0b93a8f6f484cf3b4e4d6298bd437ebfa39e31d0dfa62447e59e52aa600f4dc: Status 404 returned error can't find the container with id d0b93a8f6f484cf3b4e4d6298bd437ebfa39e31d0dfa62447e59e52aa600f4dc Apr 22 18:36:47.570648 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:47.570445 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod828d62db_71ec_4b5b_b36f_70124c98ab1b.slice/crio-dccd1c3bf50dfca463dad77c3d1fd672dad5509de2796551eb99bab9d2b94cdb WatchSource:0}: Error finding container dccd1c3bf50dfca463dad77c3d1fd672dad5509de2796551eb99bab9d2b94cdb: Status 404 returned error can't find the 
container with id dccd1c3bf50dfca463dad77c3d1fd672dad5509de2796551eb99bab9d2b94cdb Apr 22 18:36:47.572675 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:47.571143 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd27c1a_c1d0_44cd_b22e_2ac77ce9dd17.slice/crio-78aea5618ab92dd0c9fbcd3cfa4835f578dfbdaddcfefde5e963646a9addf122 WatchSource:0}: Error finding container 78aea5618ab92dd0c9fbcd3cfa4835f578dfbdaddcfefde5e963646a9addf122: Status 404 returned error can't find the container with id 78aea5618ab92dd0c9fbcd3cfa4835f578dfbdaddcfefde5e963646a9addf122 Apr 22 18:36:47.574875 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:47.574855 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08a25179_34e4_4328_9e3a_dfcc4b1aa177.slice/crio-5dfea9d087098e01241b9c5451e7fcd4bb4f85b2af4006b2e7e90463a99dc4e2 WatchSource:0}: Error finding container 5dfea9d087098e01241b9c5451e7fcd4bb4f85b2af4006b2e7e90463a99dc4e2: Status 404 returned error can't find the container with id 5dfea9d087098e01241b9c5451e7fcd4bb4f85b2af4006b2e7e90463a99dc4e2 Apr 22 18:36:47.575575 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:36:47.575550 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90bf1722_06bc_4b40_aeb1_0f3d321b412b.slice/crio-fa5ec5699b3493a8f39bdb09a07bd2895b46ac7a17d3b2b31dd0640d93d59098 WatchSource:0}: Error finding container fa5ec5699b3493a8f39bdb09a07bd2895b46ac7a17d3b2b31dd0640d93d59098: Status 404 returned error can't find the container with id fa5ec5699b3493a8f39bdb09a07bd2895b46ac7a17d3b2b31dd0640d93d59098 Apr 22 18:36:47.836838 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.836597 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:45 +0000 UTC" 
deadline="2027-11-20 00:56:44.893210876 +0000 UTC" Apr 22 18:36:47.836838 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.836833 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13830h19m57.056385634s" Apr 22 18:36:47.941852 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.941812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"5dfea9d087098e01241b9c5451e7fcd4bb4f85b2af4006b2e7e90463a99dc4e2"} Apr 22 18:36:47.944889 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.944853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjb4s" event={"ID":"828d62db-71ec-4b5b-b36f-70124c98ab1b","Type":"ContainerStarted","Data":"dccd1c3bf50dfca463dad77c3d1fd672dad5509de2796551eb99bab9d2b94cdb"} Apr 22 18:36:47.947190 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.947150 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" event={"ID":"bd94de2c-a623-493f-b731-6e8e75d73ccb","Type":"ContainerStarted","Data":"d0b93a8f6f484cf3b4e4d6298bd437ebfa39e31d0dfa62447e59e52aa600f4dc"} Apr 22 18:36:47.950300 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.950269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tdjfh" event={"ID":"e513b58f-d876-4d46-a313-2450da792164","Type":"ContainerStarted","Data":"2782b771458feb2bb7089ece4f8d52a1a871cc90c2afad7e1c6b11533bcf5946"} Apr 22 18:36:47.951835 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.951809 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" event={"ID":"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6","Type":"ContainerStarted","Data":"f7e3425815da23736345326cdfa35de51bc2a285dd8f9a4aa24d1ed762953950"} Apr 22 18:36:47.955060 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.955025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-smstv" event={"ID":"2b064dca-dd29-4d80-aaba-ffe324b856eb","Type":"ContainerStarted","Data":"60b6c5313967a3310865f0dd9522ead82668f0a546eb5e8602d77ffe4846a75a"}
Apr 22 18:36:47.958494 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.958461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vnqvq" event={"ID":"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17","Type":"ContainerStarted","Data":"78aea5618ab92dd0c9fbcd3cfa4835f578dfbdaddcfefde5e963646a9addf122"}
Apr 22 18:36:47.964278 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.964249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" event={"ID":"8c3f3197044afe14bb61b56250727776","Type":"ContainerStarted","Data":"84820af82dff3ac3c631407cbfdeba09fa030d9953979fb27fdf92e912a068e8"}
Apr 22 18:36:47.965988 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.965958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f4ls2" event={"ID":"fb9d517f-20af-4295-b872-990841be4bb8","Type":"ContainerStarted","Data":"60170d5ec90aee8c05a99761bdebb7e5102fbd8351a1e23954d2c8418ccf6a20"}
Apr 22 18:36:47.967312 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.967192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerStarted","Data":"fa5ec5699b3493a8f39bdb09a07bd2895b46ac7a17d3b2b31dd0640d93d59098"}
Apr 22 18:36:47.982478 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:47.982417 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-142.ec2.internal" podStartSLOduration=1.982397369 podStartE2EDuration="1.982397369s" podCreationTimestamp="2026-04-22 18:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:47.982240361 +0000 UTC m=+3.689462792" watchObservedRunningTime="2026-04-22 18:36:47.982397369 +0000 UTC m=+3.689619777"
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:48.395053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:48.395137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.395286 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.395305 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.395318 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p6ms7 for pod openshift-network-diagnostics/network-check-target-xg9qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.395377 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7 podName:5664213f-7ddf-424a-af58-77eee7d05998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:50.395358931 +0000 UTC m=+6.102581349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-p6ms7" (UniqueName: "kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7") pod "network-check-target-xg9qw" (UID: "5664213f-7ddf-424a-af58-77eee7d05998") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.395777 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:48.396194 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.395827 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs podName:261a5574-d8e0-4ff1-99d3-ac73ec229f92 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:50.395811398 +0000 UTC m=+6.103033808 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs") pod "network-metrics-daemon-sx6lq" (UID: "261a5574-d8e0-4ff1-99d3-ac73ec229f92") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:48.930596 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:48.930514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:48.931057 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.930648 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:36:48.931139 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:48.931121 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:48.931248 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:48.931226 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:36:48.991169 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:48.991096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" event={"ID":"83e556b964a4cb65f16be5aa8fc1307d","Type":"ContainerDied","Data":"4e608edf968d0f110b5c93b2ce710bbad076367d412488e0a7ccc5ac0bb196f7"}
Apr 22 18:36:48.992106 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:48.990912 2575 generic.go:358] "Generic (PLEG): container finished" podID="83e556b964a4cb65f16be5aa8fc1307d" containerID="4e608edf968d0f110b5c93b2ce710bbad076367d412488e0a7ccc5ac0bb196f7" exitCode=0
Apr 22 18:36:50.018507 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:50.017767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" event={"ID":"83e556b964a4cb65f16be5aa8fc1307d","Type":"ContainerStarted","Data":"86fb92c6206bf611d34415b0de69143530144c2141012e9e37b6dc433065b7ca"}
Apr 22 18:36:50.416259 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:50.416215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:50.416434 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:50.416301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:50.416497 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.416430 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:50.416497 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.416495 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs podName:261a5574-d8e0-4ff1-99d3-ac73ec229f92 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:54.416475865 +0000 UTC m=+10.123698291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs") pod "network-metrics-daemon-sx6lq" (UID: "261a5574-d8e0-4ff1-99d3-ac73ec229f92") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:50.416898 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.416877 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:50.417080 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.416903 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:50.417080 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.416915 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p6ms7 for pod openshift-network-diagnostics/network-check-target-xg9qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:50.417080 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.416983 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7 podName:5664213f-7ddf-424a-af58-77eee7d05998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:54.416965776 +0000 UTC m=+10.124188185 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-p6ms7" (UniqueName: "kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7") pod "network-check-target-xg9qw" (UID: "5664213f-7ddf-424a-af58-77eee7d05998") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:50.939826 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:50.938524 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:50.939826 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.938712 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:36:50.939826 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:50.939575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:50.939826 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:50.939763 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:36:52.931300 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:52.930763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:52.931300 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:52.930910 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:36:52.931831 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:52.931414 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:52.931831 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:52.931532 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:36:54.452861 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:54.452820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:54.453414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:54.452885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:54.453414 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.453052 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:54.453414 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.453071 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:54.453414 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.453086 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p6ms7 for pod openshift-network-diagnostics/network-check-target-xg9qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:54.453414 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.453152 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7 podName:5664213f-7ddf-424a-af58-77eee7d05998 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:02.453130442 +0000 UTC m=+18.160352861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-p6ms7" (UniqueName: "kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7") pod "network-check-target-xg9qw" (UID: "5664213f-7ddf-424a-af58-77eee7d05998") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:54.453683 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.453567 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:54.453683 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.453622 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs podName:261a5574-d8e0-4ff1-99d3-ac73ec229f92 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:02.453603261 +0000 UTC m=+18.160825672 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs") pod "network-metrics-daemon-sx6lq" (UID: "261a5574-d8e0-4ff1-99d3-ac73ec229f92") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:54.930138 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:54.930058 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:54.930310 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.930179 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:36:54.930538 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:54.930515 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:54.931718 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:54.931687 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:36:56.929488 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:56.929453 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:56.929960 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:56.929465 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:56.929960 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:56.929612 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:36:56.929960 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:56.929727 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:36:58.930044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:58.930000 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:36:58.930455 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:36:58.930003 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:36:58.930455 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:58.930164 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:36:58.930455 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:36:58.930223 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:37:00.930015 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:00.929975 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:37:00.930494 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:00.930106 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:37:00.930494 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:00.929972 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:37:00.930619 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:00.930540 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:37:02.514111 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:02.514070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:37:02.514601 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:02.514138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:37:02.514601 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.514267 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:02.514601 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.514331 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs podName:261a5574-d8e0-4ff1-99d3-ac73ec229f92 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:18.514312191 +0000 UTC m=+34.221534597 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs") pod "network-metrics-daemon-sx6lq" (UID: "261a5574-d8e0-4ff1-99d3-ac73ec229f92") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:02.514601 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.514267 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:02.514601 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.514367 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:02.514601 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.514383 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p6ms7 for pod openshift-network-diagnostics/network-check-target-xg9qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:02.514601 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.514455 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7 podName:5664213f-7ddf-424a-af58-77eee7d05998 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:18.514438593 +0000 UTC m=+34.221661018 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-p6ms7" (UniqueName: "kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7") pod "network-check-target-xg9qw" (UID: "5664213f-7ddf-424a-af58-77eee7d05998") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:02.929521 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:02.929471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:37:02.929712 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:02.929479 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:37:02.929712 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.929614 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:37:02.929712 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:02.929697 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:37:04.937475 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:04.936743 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:37:04.937475 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:04.936857 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998"
Apr 22 18:37:04.937475 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:04.937325 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq"
Apr 22 18:37:04.937475 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:04.937432 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92"
Apr 22 18:37:05.047627 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:05.047283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qjb4s" event={"ID":"828d62db-71ec-4b5b-b36f-70124c98ab1b","Type":"ContainerStarted","Data":"e5d8685087581266230400ad834e7d87fbf548ab3e01ad0507fd0854ccbbc9c9"}
Apr 22 18:37:05.051406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:05.051372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" event={"ID":"bd94de2c-a623-493f-b731-6e8e75d73ccb","Type":"ContainerStarted","Data":"7d89028d88a933690bb45d7ddc395f8ce2622e6e77811b4310c7581298351e95"}
Apr 22 18:37:05.053765 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:05.053724 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"e3205fbdc20b21414821c1c3fb47f492a7dbe83e041dc8916bba923aaa399da6"}
Apr 22 18:37:05.070639 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:05.069511 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-142.ec2.internal" podStartSLOduration=19.069478134 podStartE2EDuration="19.069478134s" podCreationTimestamp="2026-04-22 18:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:50.039051933 +0000 UTC m=+5.746274364" watchObservedRunningTime="2026-04-22 18:37:05.069478134 +0000 UTC m=+20.776700563"
Apr 22 18:37:05.098647 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:05.098607 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qjb4s" podStartSLOduration=2.8603551769999997 podStartE2EDuration="20.09859378s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.572767285 +0000 UTC m=+3.279989697" lastFinishedPulling="2026-04-22 18:37:04.811005892 +0000 UTC m=+20.518228300" observedRunningTime="2026-04-22 18:37:05.070364435 +0000 UTC m=+20.777586867" watchObservedRunningTime="2026-04-22 18:37:05.09859378 +0000 UTC m=+20.805816208"
Apr 22 18:37:05.099107 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:05.098893 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vgxcc" podStartSLOduration=3.906448738 podStartE2EDuration="21.09888156s" podCreationTimestamp="2026-04-22 18:36:44 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.571718997 +0000 UTC m=+3.278941406" lastFinishedPulling="2026-04-22 18:37:04.764151803 +0000 UTC m=+20.471374228" observedRunningTime="2026-04-22 18:37:05.098248272 +0000 UTC m=+20.805470701" watchObservedRunningTime="2026-04-22 18:37:05.09888156 +0000 UTC m=+20.806103990"
Apr 22 18:37:05.955523 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:05.955350 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:37:06.056803 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.056767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" event={"ID":"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6","Type":"ContainerStarted","Data":"4ad6e945a00d8992e3d1745bbbd435fc4f3eadb074b3fe8465476a3d9db3040e"}
Apr 22 18:37:06.056803 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.056810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" event={"ID":"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6","Type":"ContainerStarted","Data":"c53cc371f445e28650c1c754f81b5ff1fd8a5c1288b8c4154a79b99aab20a15f"}
Apr 22 18:37:06.057871 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.057850 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-smstv" event={"ID":"2b064dca-dd29-4d80-aaba-ffe324b856eb","Type":"ContainerStarted","Data":"161ea6d2387c7704a227f7f27d11931f1dc5230004e164f964bb2300edaae2e0"}
Apr 22 18:37:06.059086 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.059064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vnqvq" event={"ID":"6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17","Type":"ContainerStarted","Data":"cfc5d62801850948fd10bd2d9eb9aa9d43c7b1012f330eb13b6c8ed65b01faff"}
Apr 22 18:37:06.060219 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.060200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f4ls2" event={"ID":"fb9d517f-20af-4295-b872-990841be4bb8","Type":"ContainerStarted","Data":"fa0b31f996560d498e01f32f29233c070c26b98cca1a7d81dd7e07dc25044d28"}
Apr 22 18:37:06.061451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.061429 2575 generic.go:358] "Generic (PLEG): container finished" podID="90bf1722-06bc-4b40-aeb1-0f3d321b412b" containerID="8d897170a05aa0c14db9077e27a9fd334ce3ee92c9d76a5536ec099c2f818e3e" exitCode=0
Apr 22 18:37:06.061526 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.061502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerDied","Data":"8d897170a05aa0c14db9077e27a9fd334ce3ee92c9d76a5536ec099c2f818e3e"}
Apr 22 18:37:06.064186 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.064163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"5dd658e8bf7e03518a96ac91ba2d28c86259218ab3ca2d219484084a02ce2367"}
Apr 22 18:37:06.064269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.064190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"2434315fa988820b72013362245ef64e5797c95e6848c0b15dd7c3555d78be90"}
Apr 22 18:37:06.064269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.064199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"207a40ffc1e880140c68ec17db19e4551236217f3a2371e2b12c5b047a8067f9"}
Apr 22 18:37:06.064269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.064207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"fd15ea2cae195c9d8da0ab26c695cbf2d605df91369615500d43550b9d5f181b"}
Apr 22 18:37:06.064269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.064215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"e44da6c794205d1b2b26155c11144f316f6639aa6894cd7d36f79b2669c48b66"}
Apr 22 18:37:06.065238 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.065215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tdjfh" event={"ID":"e513b58f-d876-4d46-a313-2450da792164","Type":"ContainerStarted","Data":"3cf898ae30e342cc853d00636b19d0532e9df94ca73de8db01ceb456664b0a0e"}
Apr 22 18:37:06.076566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.076488 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-smstv" podStartSLOduration=4.888602256 podStartE2EDuration="22.076476074s" podCreationTimestamp="2026-04-22 18:36:44 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.579203136 +0000 UTC m=+3.286425542" lastFinishedPulling="2026-04-22 18:37:04.76707695 +0000 UTC m=+20.474299360" observedRunningTime="2026-04-22 18:37:06.076208598 +0000 UTC m=+21.783431025" watchObservedRunningTime="2026-04-22 18:37:06.076476074 +0000 UTC m=+21.783698502"
Apr 22 18:37:06.127173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.127123 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tdjfh" podStartSLOduration=3.930805432 podStartE2EDuration="21.127107882s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.570523289 +0000 UTC m=+3.277745715" lastFinishedPulling="2026-04-22 18:37:04.766825743 +0000 UTC m=+20.474048165" observedRunningTime="2026-04-22 18:37:06.126861285 +0000 UTC m=+21.834083710" watchObservedRunningTime="2026-04-22 18:37:06.127107882 +0000 UTC m=+21.834330359"
Apr 22 18:37:06.145143 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.145078 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vnqvq" podStartSLOduration=3.953580206 podStartE2EDuration="21.145059481s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.57557677 +0000 UTC m=+3.282799189" lastFinishedPulling="2026-04-22 18:37:04.767056055 +0000 UTC m=+20.474278464" observedRunningTime="2026-04-22 18:37:06.14498908 +0000 UTC m=+21.852211509" watchObservedRunningTime="2026-04-22 18:37:06.145059481 +0000 UTC m=+21.852281908"
Apr 22 18:37:06.163608 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.163553 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f4ls2" podStartSLOduration=3.933696911 podStartE2EDuration="21.163537754s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.580504019 +0000 UTC m=+3.287726434" lastFinishedPulling="2026-04-22 18:37:04.810344859 +0000 UTC m=+20.517567277" observedRunningTime="2026-04-22 18:37:06.163296874 +0000 UTC m=+21.870519302" watchObservedRunningTime="2026-04-22 18:37:06.163537754 +0000 UTC m=+21.870760176"
Apr 22 18:37:06.840544 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.840438 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:37:05.955517055Z","UUID":"7ac33891-4ac5-470c-86b1-9e725c34b078","Handler":null,"Name":"","Endpoint":""}
Apr 22 18:37:06.842203 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.842179 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 18:37:06.842352 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.842213 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 18:37:06.929415 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.929350 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw"
Apr 22 18:37:06.929534 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:06.929449 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:06.930318 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:06.930296 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:06.930432 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:06.930401 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:07.069419 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:07.069329 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" event={"ID":"f3e9b483-1e32-4a5e-b0d0-200cf509c4e6","Type":"ContainerStarted","Data":"85b6e9c3a3c79685210393dea0ad2f4b691198c17187c2e3e38282e2869c4314"} Apr 22 18:37:07.095859 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:07.095801 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cvxjs" podStartSLOduration=3.8740071179999997 podStartE2EDuration="23.095785486s" podCreationTimestamp="2026-04-22 18:36:44 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.581838595 +0000 UTC m=+3.289061005" lastFinishedPulling="2026-04-22 18:37:06.803616967 +0000 UTC m=+22.510839373" observedRunningTime="2026-04-22 18:37:07.095104469 +0000 UTC m=+22.802326889" watchObservedRunningTime="2026-04-22 18:37:07.095785486 +0000 UTC m=+22.803007914" Apr 22 18:37:08.074999 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:08.074959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"994108074be276e1a750cf058e512eedb3b64b7460369f16962b3a68643a3efc"} Apr 22 18:37:08.314146 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:08.313956 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:37:08.929905 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:08.929858 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:08.929905 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:08.929901 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:08.930214 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:08.930016 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:08.930214 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:08.930152 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:10.929452 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:10.929280 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:10.930127 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:10.929344 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:10.930127 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:10.929550 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:10.930127 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:10.929618 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:11.014199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:11.014158 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:37:11.014747 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:11.014728 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:37:11.083170 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:11.083136 2575 generic.go:358] "Generic (PLEG): container finished" podID="90bf1722-06bc-4b40-aeb1-0f3d321b412b" containerID="d36c2c8a3a2c0d96b468434b7f75e6e1b4882448c227a10eb0e7eb9939d785a7" exitCode=0 Apr 22 18:37:11.083342 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:11.083205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerDied","Data":"d36c2c8a3a2c0d96b468434b7f75e6e1b4882448c227a10eb0e7eb9939d785a7"} Apr 22 18:37:11.086621 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:11.086591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" event={"ID":"08a25179-34e4-4328-9e3a-dfcc4b1aa177","Type":"ContainerStarted","Data":"81cd7a45ef252c0864e6fdfd3854755ac7d580d9a525739c2949a5f8ed4b934d"} Apr 22 18:37:11.087305 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:11.087288 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tdjfh" Apr 22 18:37:11.142129 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:11.142078 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" podStartSLOduration=8.841971555 podStartE2EDuration="26.142063297s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" 
firstStartedPulling="2026-04-22 18:36:47.578075241 +0000 UTC m=+3.285297660" lastFinishedPulling="2026-04-22 18:37:04.878166996 +0000 UTC m=+20.585389402" observedRunningTime="2026-04-22 18:37:11.141876884 +0000 UTC m=+26.849099328" watchObservedRunningTime="2026-04-22 18:37:11.142063297 +0000 UTC m=+26.849285725" Apr 22 18:37:12.089368 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:12.089337 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:37:12.089832 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:12.089380 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:37:12.089832 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:12.089395 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:37:12.104598 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:12.104571 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:37:12.104814 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:12.104638 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:37:12.929781 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:12.929740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:12.929992 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:12.929788 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:12.929992 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:12.929861 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:12.930080 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:12.930010 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:13.093001 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:13.092967 2575 generic.go:358] "Generic (PLEG): container finished" podID="90bf1722-06bc-4b40-aeb1-0f3d321b412b" containerID="fe5cfbb7800401865dcf9f48ab4468887badd632f202b65cafce230135036be0" exitCode=0 Apr 22 18:37:13.093391 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:13.093049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerDied","Data":"fe5cfbb7800401865dcf9f48ab4468887badd632f202b65cafce230135036be0"} Apr 22 18:37:14.929952 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:14.929901 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:14.930323 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:14.930018 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:14.930323 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:14.930078 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:14.930323 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:14.930165 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:15.098797 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:15.098764 2575 generic.go:358] "Generic (PLEG): container finished" podID="90bf1722-06bc-4b40-aeb1-0f3d321b412b" containerID="364d030a951e7de2d57669643b626ba674a5dea10f46433db05dd26290ccda8a" exitCode=0 Apr 22 18:37:15.098997 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:15.098833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerDied","Data":"364d030a951e7de2d57669643b626ba674a5dea10f46433db05dd26290ccda8a"} Apr 22 18:37:16.929185 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:16.929147 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:16.929630 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:16.929199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:16.929630 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:16.929294 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:16.929630 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:16.929441 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:18.538610 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:18.538571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:18.539091 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:18.538633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:18.539091 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.538730 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:37:18.539091 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.538747 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:18.539091 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.538753 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:37:18.539091 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.538765 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p6ms7 for pod openshift-network-diagnostics/network-check-target-xg9qw: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:18.539091 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.538812 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7 podName:5664213f-7ddf-424a-af58-77eee7d05998 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:50.53879941 +0000 UTC m=+66.246021821 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-p6ms7" (UniqueName: "kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7") pod "network-check-target-xg9qw" (UID: "5664213f-7ddf-424a-af58-77eee7d05998") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:18.539091 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.538827 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs podName:261a5574-d8e0-4ff1-99d3-ac73ec229f92 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:50.538820359 +0000 UTC m=+66.246042765 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs") pod "network-metrics-daemon-sx6lq" (UID: "261a5574-d8e0-4ff1-99d3-ac73ec229f92") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:18.929966 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:18.929910 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:18.929966 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:18.929960 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:18.930211 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.930061 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:18.930211 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:18.930185 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:20.929738 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:20.929698 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:20.930207 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:20.929822 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:20.930207 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:20.929835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:20.930207 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:20.929974 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:21.213763 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:21.213517 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sx6lq"] Apr 22 18:37:21.213965 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:21.213875 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:21.214057 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:21.214022 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:21.215914 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:21.215644 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xg9qw"] Apr 22 18:37:21.215914 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:21.215717 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:21.215914 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:21.215789 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:22.115161 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:22.115125 2575 generic.go:358] "Generic (PLEG): container finished" podID="90bf1722-06bc-4b40-aeb1-0f3d321b412b" containerID="16eb543bf01fecb3c2ec5d5da28932e396a8ba011cd39c16ca785400a55ba2d1" exitCode=0 Apr 22 18:37:22.115710 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:22.115185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerDied","Data":"16eb543bf01fecb3c2ec5d5da28932e396a8ba011cd39c16ca785400a55ba2d1"} Apr 22 18:37:22.929452 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:22.929403 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:22.929657 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:22.929524 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sx6lq" podUID="261a5574-d8e0-4ff1-99d3-ac73ec229f92" Apr 22 18:37:22.929657 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:22.929585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:22.929769 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:22.929684 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg9qw" podUID="5664213f-7ddf-424a-af58-77eee7d05998" Apr 22 18:37:23.122101 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:23.122065 2575 generic.go:358] "Generic (PLEG): container finished" podID="90bf1722-06bc-4b40-aeb1-0f3d321b412b" containerID="5c5a389bc8fd90828201f80000d45f23770eed0f6cdbed21b8c62c6dc8dfab8a" exitCode=0 Apr 22 18:37:23.122474 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:23.122134 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerDied","Data":"5c5a389bc8fd90828201f80000d45f23770eed0f6cdbed21b8c62c6dc8dfab8a"} Apr 22 18:37:24.127044 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.127004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krfvd" event={"ID":"90bf1722-06bc-4b40-aeb1-0f3d321b412b","Type":"ContainerStarted","Data":"80888bcf78c6adbc6ea5d0d821adcb519f95a63308d3cfcbde355cb062187f71"} Apr 22 18:37:24.158063 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.157993 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-krfvd" 
podStartSLOduration=5.676148009 podStartE2EDuration="39.15797342s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:36:47.580056381 +0000 UTC m=+3.287278795" lastFinishedPulling="2026-04-22 18:37:21.061881799 +0000 UTC m=+36.769104206" observedRunningTime="2026-04-22 18:37:24.157817682 +0000 UTC m=+39.865040109" watchObservedRunningTime="2026-04-22 18:37:24.15797342 +0000 UTC m=+39.865195849" Apr 22 18:37:24.641764 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.641547 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-142.ec2.internal" event="NodeReady" Apr 22 18:37:24.641955 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.641862 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:37:24.702214 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.702181 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ndm6t"] Apr 22 18:37:24.731867 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.731834 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fcfqp"] Apr 22 18:37:24.732063 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.732040 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.735412 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.735383 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:37:24.735562 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.735457 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7qsn6\"" Apr 22 18:37:24.735562 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.735496 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:37:24.735659 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.735620 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:37:24.736790 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.736772 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:37:24.745324 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.745297 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fcfqp"] Apr 22 18:37:24.745467 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.745456 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:24.748755 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.748735 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:37:24.748907 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.748891 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:37:24.757709 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.757644 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:37:24.758355 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.758340 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x8r7b\"" Apr 22 18:37:24.760545 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.760519 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ndm6t"] Apr 22 18:37:24.810006 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.809970 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nlsz6"] Apr 22 18:37:24.831352 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.831321 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:24.833814 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.833783 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nlsz6"] Apr 22 18:37:24.835451 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.835424 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:37:24.835558 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.835512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xfhxc\"" Apr 22 18:37:24.835558 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.835513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:37:24.883117 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.883090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d86ab162-e73e-453a-99c0-3c1735a2db66-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.883117 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.883123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d86ab162-e73e-453a-99c0-3c1735a2db66-data-volume\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.883344 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.883153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtr9\" (UniqueName: 
\"kubernetes.io/projected/9a0dbd88-fa39-4383-b147-1af4bcb23b83-kube-api-access-lhtr9\") pod \"ingress-canary-fcfqp\" (UID: \"9a0dbd88-fa39-4383-b147-1af4bcb23b83\") " pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:24.883344 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.883183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d86ab162-e73e-453a-99c0-3c1735a2db66-crio-socket\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.883344 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.883228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mnp\" (UniqueName: \"kubernetes.io/projected/d86ab162-e73e-453a-99c0-3c1735a2db66-kube-api-access-t4mnp\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.883344 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.883261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a0dbd88-fa39-4383-b147-1af4bcb23b83-cert\") pod \"ingress-canary-fcfqp\" (UID: \"9a0dbd88-fa39-4383-b147-1af4bcb23b83\") " pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:24.883344 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.883284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d86ab162-e73e-453a-99c0-3c1735a2db66-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 
18:37:24.930742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.930707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:24.930946 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.930910 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:24.934247 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.934227 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-95dwn\"" Apr 22 18:37:24.934355 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.934318 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zwsxn\"" Apr 22 18:37:24.934355 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.934337 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:37:24.934440 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.934416 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:37:24.934546 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.934530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:37:24.983937 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.983897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d86ab162-e73e-453a-99c0-3c1735a2db66-data-volume\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.984114 ip-10-0-129-142 kubenswrapper[2575]: 
I0422 18:37:24.983967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtr9\" (UniqueName: \"kubernetes.io/projected/9a0dbd88-fa39-4383-b147-1af4bcb23b83-kube-api-access-lhtr9\") pod \"ingress-canary-fcfqp\" (UID: \"9a0dbd88-fa39-4383-b147-1af4bcb23b83\") " pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:24.984114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.983990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d86ab162-e73e-453a-99c0-3c1735a2db66-crio-socket\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.984114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43bb7de4-00c0-4077-a922-4be165674fa5-config-volume\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:24.984114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43bb7de4-00c0-4077-a922-4be165674fa5-tmp-dir\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:24.984114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4mnp\" (UniqueName: \"kubernetes.io/projected/d86ab162-e73e-453a-99c0-3c1735a2db66-kube-api-access-t4mnp\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " 
pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.984114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43bb7de4-00c0-4077-a922-4be165674fa5-metrics-tls\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:24.984114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984108 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kplw9\" (UniqueName: \"kubernetes.io/projected/43bb7de4-00c0-4077-a922-4be165674fa5-kube-api-access-kplw9\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:24.984456 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a0dbd88-fa39-4383-b147-1af4bcb23b83-cert\") pod \"ingress-canary-fcfqp\" (UID: \"9a0dbd88-fa39-4383-b147-1af4bcb23b83\") " pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:24.984456 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d86ab162-e73e-453a-99c0-3c1735a2db66-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.984456 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/d86ab162-e73e-453a-99c0-3c1735a2db66-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.984456 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d86ab162-e73e-453a-99c0-3c1735a2db66-crio-socket\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.984456 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d86ab162-e73e-453a-99c0-3c1735a2db66-data-volume\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.984724 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.984705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d86ab162-e73e-453a-99c0-3c1735a2db66-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.988459 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:24.988433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d86ab162-e73e-453a-99c0-3c1735a2db66-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:24.997283 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:37:24.997255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4mnp\" (UniqueName: \"kubernetes.io/projected/d86ab162-e73e-453a-99c0-3c1735a2db66-kube-api-access-t4mnp\") pod \"insights-runtime-extractor-ndm6t\" (UID: \"d86ab162-e73e-453a-99c0-3c1735a2db66\") " pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:25.003662 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.003634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9a0dbd88-fa39-4383-b147-1af4bcb23b83-cert\") pod \"ingress-canary-fcfqp\" (UID: \"9a0dbd88-fa39-4383-b147-1af4bcb23b83\") " pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:25.011480 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.011396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtr9\" (UniqueName: \"kubernetes.io/projected/9a0dbd88-fa39-4383-b147-1af4bcb23b83-kube-api-access-lhtr9\") pod \"ingress-canary-fcfqp\" (UID: \"9a0dbd88-fa39-4383-b147-1af4bcb23b83\") " pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:25.041337 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.041305 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ndm6t" Apr 22 18:37:25.053648 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.053622 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fcfqp" Apr 22 18:37:25.085494 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.085458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43bb7de4-00c0-4077-a922-4be165674fa5-metrics-tls\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.085494 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.085497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kplw9\" (UniqueName: \"kubernetes.io/projected/43bb7de4-00c0-4077-a922-4be165674fa5-kube-api-access-kplw9\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.085916 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.085854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43bb7de4-00c0-4077-a922-4be165674fa5-config-volume\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.085916 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.085896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43bb7de4-00c0-4077-a922-4be165674fa5-tmp-dir\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.086253 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.086232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43bb7de4-00c0-4077-a922-4be165674fa5-tmp-dir\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 
22 18:37:25.086948 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.086520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43bb7de4-00c0-4077-a922-4be165674fa5-config-volume\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.088943 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.088906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43bb7de4-00c0-4077-a922-4be165674fa5-metrics-tls\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.103528 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.103452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kplw9\" (UniqueName: \"kubernetes.io/projected/43bb7de4-00c0-4077-a922-4be165674fa5-kube-api-access-kplw9\") pod \"dns-default-nlsz6\" (UID: \"43bb7de4-00c0-4077-a922-4be165674fa5\") " pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.142411 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.142383 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:25.190830 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.190791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fcfqp"] Apr 22 18:37:25.196575 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:25.196537 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0dbd88_fa39_4383_b147_1af4bcb23b83.slice/crio-e3b363b521467483ab20bf591cf3ec1cd66341c4f834e8ef5dc4e846be6126b7 WatchSource:0}: Error finding container e3b363b521467483ab20bf591cf3ec1cd66341c4f834e8ef5dc4e846be6126b7: Status 404 returned error can't find the container with id e3b363b521467483ab20bf591cf3ec1cd66341c4f834e8ef5dc4e846be6126b7 Apr 22 18:37:25.201351 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.200799 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ndm6t"] Apr 22 18:37:25.281655 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:25.281457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nlsz6"] Apr 22 18:37:25.284894 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:25.284865 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43bb7de4_00c0_4077_a922_4be165674fa5.slice/crio-4ca82b041d0c6fc9090860acee1185627a78f89728b06cdeb16607beb35b00b9 WatchSource:0}: Error finding container 4ca82b041d0c6fc9090860acee1185627a78f89728b06cdeb16607beb35b00b9: Status 404 returned error can't find the container with id 4ca82b041d0c6fc9090860acee1185627a78f89728b06cdeb16607beb35b00b9 Apr 22 18:37:26.133476 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:26.133415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nlsz6" 
event={"ID":"43bb7de4-00c0-4077-a922-4be165674fa5","Type":"ContainerStarted","Data":"4ca82b041d0c6fc9090860acee1185627a78f89728b06cdeb16607beb35b00b9"} Apr 22 18:37:26.134779 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:26.134740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fcfqp" event={"ID":"9a0dbd88-fa39-4383-b147-1af4bcb23b83","Type":"ContainerStarted","Data":"e3b363b521467483ab20bf591cf3ec1cd66341c4f834e8ef5dc4e846be6126b7"} Apr 22 18:37:26.136979 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:26.136945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ndm6t" event={"ID":"d86ab162-e73e-453a-99c0-3c1735a2db66","Type":"ContainerStarted","Data":"f89d1b05dc9467bc7cb178b7352fb4e3a93ceeda1eeb110796db59657e95507f"} Apr 22 18:37:26.137095 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:26.136988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ndm6t" event={"ID":"d86ab162-e73e-453a-99c0-3c1735a2db66","Type":"ContainerStarted","Data":"249cd7d98a3b3506f6e9d8093e8234b1a7e18a6b3e551b2d6ee17afe180cf628"} Apr 22 18:37:26.137095 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:26.137002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ndm6t" event={"ID":"d86ab162-e73e-453a-99c0-3c1735a2db66","Type":"ContainerStarted","Data":"10ec0a6268d97f924276db7bde858ac16c479dfb72857452b8a64bc7280e2863"} Apr 22 18:37:27.073571 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.073519 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ddhgb"] Apr 22 18:37:27.076739 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.076710 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.081435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.081203 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 18:37:27.081435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.081247 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-bgzsm\"" Apr 22 18:37:27.081435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.081305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:37:27.081718 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.081671 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:37:27.081906 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.081878 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:37:27.082276 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.082257 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 18:37:27.087376 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.087351 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ddhgb"] Apr 22 18:37:27.200025 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.199732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39e00ba2-0cd0-486e-be96-48d1b4f2448f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.200025 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.199785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39e00ba2-0cd0-486e-be96-48d1b4f2448f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.200025 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.199808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/39e00ba2-0cd0-486e-be96-48d1b4f2448f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.200025 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.199839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdlc6\" (UniqueName: \"kubernetes.io/projected/39e00ba2-0cd0-486e-be96-48d1b4f2448f-kube-api-access-zdlc6\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.300424 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.300390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39e00ba2-0cd0-486e-be96-48d1b4f2448f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 
22 18:37:27.300565 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.300435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/39e00ba2-0cd0-486e-be96-48d1b4f2448f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.300565 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.300459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdlc6\" (UniqueName: \"kubernetes.io/projected/39e00ba2-0cd0-486e-be96-48d1b4f2448f-kube-api-access-zdlc6\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.300565 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.300554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39e00ba2-0cd0-486e-be96-48d1b4f2448f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.301308 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.301291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39e00ba2-0cd0-486e-be96-48d1b4f2448f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.304375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.304342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/39e00ba2-0cd0-486e-be96-48d1b4f2448f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.304375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.304367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39e00ba2-0cd0-486e-be96-48d1b4f2448f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.315230 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.315202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdlc6\" (UniqueName: \"kubernetes.io/projected/39e00ba2-0cd0-486e-be96-48d1b4f2448f-kube-api-access-zdlc6\") pod \"prometheus-operator-5676c8c784-ddhgb\" (UID: \"39e00ba2-0cd0-486e-be96-48d1b4f2448f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.389556 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.389463 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" Apr 22 18:37:27.898256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:27.898038 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ddhgb"] Apr 22 18:37:27.903133 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:27.903097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39e00ba2_0cd0_486e_be96_48d1b4f2448f.slice/crio-58e3c4078034f1a2bff59840105b286fdc46816335aa209d7fd653629ad7c2b5 WatchSource:0}: Error finding container 58e3c4078034f1a2bff59840105b286fdc46816335aa209d7fd653629ad7c2b5: Status 404 returned error can't find the container with id 58e3c4078034f1a2bff59840105b286fdc46816335aa209d7fd653629ad7c2b5 Apr 22 18:37:28.146862 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.146820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nlsz6" event={"ID":"43bb7de4-00c0-4077-a922-4be165674fa5","Type":"ContainerStarted","Data":"3b5cff7ab86de42d7b609c28d37e267e0d0222bd21914742389e9c5033042274"} Apr 22 18:37:28.146862 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.146865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nlsz6" event={"ID":"43bb7de4-00c0-4077-a922-4be165674fa5","Type":"ContainerStarted","Data":"8486f6947a7b0e170421a94505d02542f7c4e7573598db21081831e5bf2efefb"} Apr 22 18:37:28.147672 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.146892 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:28.148774 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.148747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ndm6t" 
event={"ID":"d86ab162-e73e-453a-99c0-3c1735a2db66","Type":"ContainerStarted","Data":"2e50c2cc19d87cad3a19cad060b3afd6a875067ecea1e0cf7683334380362f3f"} Apr 22 18:37:28.150904 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.150874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fcfqp" event={"ID":"9a0dbd88-fa39-4383-b147-1af4bcb23b83","Type":"ContainerStarted","Data":"7f67bc57ce33bbe85e47c1d531d7c08e995b51902aa4d022463e9b8656091878"} Apr 22 18:37:28.151859 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.151833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" event={"ID":"39e00ba2-0cd0-486e-be96-48d1b4f2448f","Type":"ContainerStarted","Data":"58e3c4078034f1a2bff59840105b286fdc46816335aa209d7fd653629ad7c2b5"} Apr 22 18:37:28.200810 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.200756 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ndm6t" podStartSLOduration=1.7284116630000002 podStartE2EDuration="4.200741492s" podCreationTimestamp="2026-04-22 18:37:24 +0000 UTC" firstStartedPulling="2026-04-22 18:37:25.298307596 +0000 UTC m=+41.005530002" lastFinishedPulling="2026-04-22 18:37:27.770637412 +0000 UTC m=+43.477859831" observedRunningTime="2026-04-22 18:37:28.200553289 +0000 UTC m=+43.907775716" watchObservedRunningTime="2026-04-22 18:37:28.200741492 +0000 UTC m=+43.907963922" Apr 22 18:37:28.201046 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.200873 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nlsz6" podStartSLOduration=1.7820303119999998 podStartE2EDuration="4.200864905s" podCreationTimestamp="2026-04-22 18:37:24 +0000 UTC" firstStartedPulling="2026-04-22 18:37:25.286730803 +0000 UTC m=+40.993953209" lastFinishedPulling="2026-04-22 18:37:27.705565396 +0000 UTC m=+43.412787802" 
observedRunningTime="2026-04-22 18:37:28.170432068 +0000 UTC m=+43.877654496" watchObservedRunningTime="2026-04-22 18:37:28.200864905 +0000 UTC m=+43.908087334" Apr 22 18:37:28.226246 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.226140 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fcfqp" podStartSLOduration=1.716529374 podStartE2EDuration="4.226122769s" podCreationTimestamp="2026-04-22 18:37:24 +0000 UTC" firstStartedPulling="2026-04-22 18:37:25.198780907 +0000 UTC m=+40.906003313" lastFinishedPulling="2026-04-22 18:37:27.708374303 +0000 UTC m=+43.415596708" observedRunningTime="2026-04-22 18:37:28.225100915 +0000 UTC m=+43.932323340" watchObservedRunningTime="2026-04-22 18:37:28.226122769 +0000 UTC m=+43.933345194" Apr 22 18:37:28.415266 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.415229 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c7797494-lgnnv"] Apr 22 18:37:28.418079 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.418063 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.420966 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.420937 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:37:28.421091 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.421067 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qrjch\"" Apr 22 18:37:28.421091 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.421080 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:37:28.421312 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.421161 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:37:28.421480 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.421461 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:37:28.421551 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.421541 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:37:28.421756 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.421741 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:37:28.422430 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.422412 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:37:28.436463 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.436430 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7797494-lgnnv"] Apr 22 18:37:28.512763 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:37:28.512684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-serving-cert\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.512763 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.512732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-oauth-config\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.512763 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.512758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-service-ca\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.513029 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.512781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-oauth-serving-cert\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.513029 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.512891 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-config\") pod \"console-6c7797494-lgnnv\" (UID: 
\"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.513029 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.512912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nzz2\" (UniqueName: \"kubernetes.io/projected/a76ee650-7c49-4aeb-9e0e-d795265c262a-kube-api-access-7nzz2\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.613806 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.613766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-config\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.613806 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.613801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nzz2\" (UniqueName: \"kubernetes.io/projected/a76ee650-7c49-4aeb-9e0e-d795265c262a-kube-api-access-7nzz2\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.614070 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.613819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-serving-cert\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.614070 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.613842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-oauth-config\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.614070 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.613856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-service-ca\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.614070 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.613878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-oauth-serving-cert\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.614606 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.614582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-oauth-serving-cert\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.614702 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.614613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-config\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.614702 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.614587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-service-ca\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.617505 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.617489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-oauth-config\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.617634 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.617613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-serving-cert\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.625370 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.625347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nzz2\" (UniqueName: \"kubernetes.io/projected/a76ee650-7c49-4aeb-9e0e-d795265c262a-kube-api-access-7nzz2\") pod \"console-6c7797494-lgnnv\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.727645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.727604 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:28.871221 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:28.871163 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c7797494-lgnnv"] Apr 22 18:37:28.875889 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:28.875857 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda76ee650_7c49_4aeb_9e0e_d795265c262a.slice/crio-45ae5b95a5b627270acfee1d5e6aaa9504b22cce89a38c4c743afc5b8e5b7c5a WatchSource:0}: Error finding container 45ae5b95a5b627270acfee1d5e6aaa9504b22cce89a38c4c743afc5b8e5b7c5a: Status 404 returned error can't find the container with id 45ae5b95a5b627270acfee1d5e6aaa9504b22cce89a38c4c743afc5b8e5b7c5a Apr 22 18:37:29.156980 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:29.156900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7797494-lgnnv" event={"ID":"a76ee650-7c49-4aeb-9e0e-d795265c262a","Type":"ContainerStarted","Data":"45ae5b95a5b627270acfee1d5e6aaa9504b22cce89a38c4c743afc5b8e5b7c5a"} Apr 22 18:37:30.161669 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:30.161407 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" event={"ID":"39e00ba2-0cd0-486e-be96-48d1b4f2448f","Type":"ContainerStarted","Data":"0508d91d4854189ac10463788c0291492c1cd718bbac170b9226ee37665de3fe"} Apr 22 18:37:30.162219 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:30.161679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" event={"ID":"39e00ba2-0cd0-486e-be96-48d1b4f2448f","Type":"ContainerStarted","Data":"d2f81bb202597597c5526b1348b9234a7b0b7101551d75276f8d17bfc32d0876"} Apr 22 18:37:30.191759 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:30.191708 2575 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-ddhgb" podStartSLOduration=1.725655986 podStartE2EDuration="3.191691154s" podCreationTimestamp="2026-04-22 18:37:27 +0000 UTC" firstStartedPulling="2026-04-22 18:37:27.906310337 +0000 UTC m=+43.613532747" lastFinishedPulling="2026-04-22 18:37:29.372345509 +0000 UTC m=+45.079567915" observedRunningTime="2026-04-22 18:37:30.191322537 +0000 UTC m=+45.898544967" watchObservedRunningTime="2026-04-22 18:37:30.191691154 +0000 UTC m=+45.898913582" Apr 22 18:37:32.169029 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:32.168859 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7797494-lgnnv" event={"ID":"a76ee650-7c49-4aeb-9e0e-d795265c262a","Type":"ContainerStarted","Data":"ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8"} Apr 22 18:37:32.192458 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:32.192406 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c7797494-lgnnv" podStartSLOduration=1.203463639 podStartE2EDuration="4.192389825s" podCreationTimestamp="2026-04-22 18:37:28 +0000 UTC" firstStartedPulling="2026-04-22 18:37:28.878454376 +0000 UTC m=+44.585676786" lastFinishedPulling="2026-04-22 18:37:31.867380561 +0000 UTC m=+47.574602972" observedRunningTime="2026-04-22 18:37:32.190300423 +0000 UTC m=+47.897522851" watchObservedRunningTime="2026-04-22 18:37:32.192389825 +0000 UTC m=+47.899612254" Apr 22 18:37:35.956427 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:35.956251 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"] Apr 22 18:37:35.989903 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:35.989857 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"] Apr 22 18:37:35.990089 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:35.990019 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:35.992938 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:35.992890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:37:35.993245 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:35.993228 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 18:37:35.994721 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:35.994702 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 18:37:35.994831 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:35.994766 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-zm7j4\"" Apr 22 18:37:36.064410 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.064374 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"] Apr 22 18:37:36.073973 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.073911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.073973 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.073972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqzc4\" (UniqueName: 
\"kubernetes.io/projected/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-api-access-kqzc4\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.074114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.073995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e60c41f2-43ea-4538-b1cf-948a3af04d39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.074114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.074052 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.074173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.074112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e60c41f2-43ea-4538-b1cf-948a3af04d39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.074173 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.074145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.087400 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.087378 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"] Apr 22 18:37:36.087518 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.087494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" Apr 22 18:37:36.090736 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.090713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:37:36.091010 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.090998 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:37:36.093626 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.093606 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-wwgff\"" Apr 22 18:37:36.105916 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.105894 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4pgk8"] Apr 22 18:37:36.125819 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.125787 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.129014 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.128989 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:37:36.129014 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.128990 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:37:36.129594 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.129574 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6pfl9\"" Apr 22 18:37:36.133088 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.133065 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:37:36.175452 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqzc4\" (UniqueName: \"kubernetes.io/projected/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-api-access-kqzc4\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.175452 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-root\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.175695 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" Apr 22 18:37:36.175695 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.175695 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" Apr 22 18:37:36.175695 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e60c41f2-43ea-4538-b1cf-948a3af04d39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.175695 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-wtmp\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.175695 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.176036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnpk\" (UniqueName: \"kubernetes.io/projected/cee7930e-83ce-4067-9398-494f9751737a-kube-api-access-5mnpk\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.176036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-sys\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.176036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.176036 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e60c41f2-43ea-4538-b1cf-948a3af04d39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.176036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.175977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.176036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" Apr 22 18:37:36.176036 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e60c41f2-43ea-4538-b1cf-948a3af04d39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" Apr 22 18:37:36.176375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176056 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-textfile\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.176375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176102 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5hjg\" (UniqueName: \"kubernetes.io/projected/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-kube-api-access-f5hjg\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" Apr 22 18:37:36.176375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.176375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:36.176375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cee7930e-83ce-4067-9398-494f9751737a-metrics-client-ca\") pod 
\"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.176375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"
Apr 22 18:37:36.176664 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.176645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e60c41f2-43ea-4538-b1cf-948a3af04d39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"
Apr 22 18:37:36.181218 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.181198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"
Apr 22 18:37:36.181371 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.181351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"
Apr 22 18:37:36.192164 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.192127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqzc4\" (UniqueName: \"kubernetes.io/projected/e60c41f2-43ea-4538-b1cf-948a3af04d39-kube-api-access-kqzc4\") pod \"kube-state-metrics-69db897b98-zzrnk\" (UID: \"e60c41f2-43ea-4538-b1cf-948a3af04d39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"
Apr 22 18:37:36.276593 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-sys\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.276593 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.276593 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.276593 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-textfile\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.276593 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-sys\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5hjg\" (UniqueName: \"kubernetes.io/projected/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-kube-api-access-f5hjg\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cee7930e-83ce-4067-9398-494f9751737a-metrics-client-ca\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-root\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:36.276814 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-wtmp\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnpk\" (UniqueName: \"kubernetes.io/projected/cee7930e-83ce-4067-9398-494f9751737a-kube-api-access-5mnpk\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277003 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.276984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-textfile\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277467 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.277188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277467 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.277206 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-wtmp\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277467 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:36.277225 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls podName:cee7930e-83ce-4067-9398-494f9751737a nodeName:}" failed. No retries permitted until 2026-04-22 18:37:36.777195358 +0000 UTC m=+52.484417785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls") pod "node-exporter-4pgk8" (UID: "cee7930e-83ce-4067-9398-494f9751737a") : secret "node-exporter-tls" not found
Apr 22 18:37:36.277467 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.277254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cee7930e-83ce-4067-9398-494f9751737a-root\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277467 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.277303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cee7930e-83ce-4067-9398-494f9751737a-metrics-client-ca\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.277648 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.277615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.279376 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.279348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.279587 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.279567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.279661 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.279635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.290276 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.290242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hjg\" (UniqueName: \"kubernetes.io/projected/bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e-kube-api-access-f5hjg\") pod \"openshift-state-metrics-9d44df66c-hzthd\" (UID: \"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.300356 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.300311 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"
Apr 22 18:37:36.310422 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.310392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mnpk\" (UniqueName: \"kubernetes.io/projected/cee7930e-83ce-4067-9398-494f9751737a-kube-api-access-5mnpk\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.396815 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.396515 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"
Apr 22 18:37:36.448408 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.448348 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-zzrnk"]
Apr 22 18:37:36.450367 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:36.450324 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60c41f2_43ea_4538_b1cf_948a3af04d39.slice/crio-75d79527c23e2d03f3d224d11afd075e2d2e93903f79cfeff5130ac8ce8d4a24 WatchSource:0}: Error finding container 75d79527c23e2d03f3d224d11afd075e2d2e93903f79cfeff5130ac8ce8d4a24: Status 404 returned error can't find the container with id 75d79527c23e2d03f3d224d11afd075e2d2e93903f79cfeff5130ac8ce8d4a24
Apr 22 18:37:36.529158 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.529079 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd"]
Apr 22 18:37:36.532684 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:36.532654 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1bf2b3_bfa7_4038_bcc0_3fe40aa48c6e.slice/crio-d7566537f620d80e5e103a3f86970f0136c220a96293ed313539da9e87594dad WatchSource:0}: Error finding container d7566537f620d80e5e103a3f86970f0136c220a96293ed313539da9e87594dad: Status 404 returned error can't find the container with id d7566537f620d80e5e103a3f86970f0136c220a96293ed313539da9e87594dad
Apr 22 18:37:36.780188 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:36.780103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8"
Apr 22 18:37:36.780341 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:36.780252 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:37:36.780341 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:36.780332 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls podName:cee7930e-83ce-4067-9398-494f9751737a nodeName:}" failed. No retries permitted until 2026-04-22 18:37:37.780312482 +0000 UTC m=+53.487534888 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls") pod "node-exporter-4pgk8" (UID: "cee7930e-83ce-4067-9398-494f9751737a") : secret "node-exporter-tls" not found
Apr 22 18:37:37.015236 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.015205 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6db48549fc-qc97q"]
Apr 22 18:37:37.034134 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.034050 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db48549fc-qc97q"]
Apr 22 18:37:37.034269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.034178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.047112 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.047085 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 18:37:37.082273 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.082231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-console-config\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.082470 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.082299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-trusted-ca-bundle\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.082470 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.082367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb2jn\" (UniqueName: \"kubernetes.io/projected/955cdd70-155a-4867-a898-966f84075a1a-kube-api-access-tb2jn\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.082470 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.082410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-oauth-config\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.082470 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.082444 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-serving-cert\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.082663 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.082505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-service-ca\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.082663 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.082570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-oauth-serving-cert\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.181575 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.181525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" event={"ID":"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e","Type":"ContainerStarted","Data":"8a8759d9b8c0261020f4c889276868e8eed04169624d476963cfb642f457319d"}
Apr 22 18:37:37.181575 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.181576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" event={"ID":"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e","Type":"ContainerStarted","Data":"4c4b8eacacec0428cdbcd32a1e637b7a2f5ff164f666c3d8b518d95b8f638e46"}
Apr 22 18:37:37.181802 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.181590 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" event={"ID":"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e","Type":"ContainerStarted","Data":"d7566537f620d80e5e103a3f86970f0136c220a96293ed313539da9e87594dad"}
Apr 22 18:37:37.182951 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.182687 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" event={"ID":"e60c41f2-43ea-4538-b1cf-948a3af04d39","Type":"ContainerStarted","Data":"75d79527c23e2d03f3d224d11afd075e2d2e93903f79cfeff5130ac8ce8d4a24"}
Apr 22 18:37:37.183322 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.183299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tb2jn\" (UniqueName: \"kubernetes.io/projected/955cdd70-155a-4867-a898-966f84075a1a-kube-api-access-tb2jn\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.183453 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.183334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-oauth-config\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.183453 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.183355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-serving-cert\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.183453 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.183379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-service-ca\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.183453 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.183431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-oauth-serving-cert\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.183673 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.183467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-console-config\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.183673 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.183516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-trusted-ca-bundle\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.184482 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.184366 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-service-ca\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.184482 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.184485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-trusted-ca-bundle\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.184810 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.184784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-oauth-serving-cert\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.185242 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.185218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-console-config\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.188409 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.188381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-oauth-config\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.188706 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.188687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-serving-cert\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.196548 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.196515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb2jn\" (UniqueName: \"kubernetes.io/projected/955cdd70-155a-4867-a898-966f84075a1a-kube-api-access-tb2jn\") pod \"console-6db48549fc-qc97q\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.204841 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.204316 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:37:37.231337 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.231253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:37:37.231497 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.231470 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.234765 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.234734 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 18:37:37.234765 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.234750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-bv9nn\""
Apr 22 18:37:37.235016 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.234857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 18:37:37.235016 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.234897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 18:37:37.235016 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.234908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 18:37:37.236020 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.235977 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 18:37:37.236149 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.236039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 18:37:37.236383 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.236366 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 18:37:37.236477 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.236428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 18:37:37.239263 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.239242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 18:37:37.284897 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.284817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c70d334e-87f8-4f5e-a03b-6bd1312b7570-config-out\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.284897 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.284851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70d334e-87f8-4f5e-a03b-6bd1312b7570-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.284897 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.284886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285201 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.284965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c70d334e-87f8-4f5e-a03b-6bd1312b7570-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285201 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.284997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-config-volume\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285201 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285201 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-web-config\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285363 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c70d334e-87f8-4f5e-a03b-6bd1312b7570-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285363 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285363 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np4wp\" (UniqueName: \"kubernetes.io/projected/c70d334e-87f8-4f5e-a03b-6bd1312b7570-kube-api-access-np4wp\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285363 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c70d334e-87f8-4f5e-a03b-6bd1312b7570-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285363 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.285363 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.285353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.343742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.343697 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6db48549fc-qc97q"
Apr 22 18:37:37.386114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c70d334e-87f8-4f5e-a03b-6bd1312b7570-config-out\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.386114 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70d334e-87f8-4f5e-a03b-6bd1312b7570-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.386310 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.386310 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c70d334e-87f8-4f5e-a03b-6bd1312b7570-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:37:37.386310 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386200 2575 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-config-volume\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.386310 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.386310 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-web-config\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.386310 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c70d334e-87f8-4f5e-a03b-6bd1312b7570-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386347 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np4wp\" (UniqueName: \"kubernetes.io/projected/c70d334e-87f8-4f5e-a03b-6bd1312b7570-kube-api-access-np4wp\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c70d334e-87f8-4f5e-a03b-6bd1312b7570-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.386416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:37.386496 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:37.386556 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-main-tls podName:c70d334e-87f8-4f5e-a03b-6bd1312b7570 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:37.886540141 +0000 UTC m=+53.593762547 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "c70d334e-87f8-4f5e-a03b-6bd1312b7570") : secret "alertmanager-main-tls" not found Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.387685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c70d334e-87f8-4f5e-a03b-6bd1312b7570-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.387998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c70d334e-87f8-4f5e-a03b-6bd1312b7570-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.388868 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.388708 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c70d334e-87f8-4f5e-a03b-6bd1312b7570-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.390255 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.390212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.390414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.390394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.391956 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.391889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c70d334e-87f8-4f5e-a03b-6bd1312b7570-config-out\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.392457 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.392434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-web-config\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.392697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.392681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.393042 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.393023 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-config-volume\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.394172 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.394149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.394896 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.394877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c70d334e-87f8-4f5e-a03b-6bd1312b7570-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.401701 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.401674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np4wp\" (UniqueName: \"kubernetes.io/projected/c70d334e-87f8-4f5e-a03b-6bd1312b7570-kube-api-access-np4wp\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.497493 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.497457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6db48549fc-qc97q"] Apr 22 18:37:37.500806 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:37.500773 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955cdd70_155a_4867_a898_966f84075a1a.slice/crio-c51f49647453324943ed39a8a5202d0f3e5dad7c1da4ae6c7ec3c7a02de8d320 
WatchSource:0}: Error finding container c51f49647453324943ed39a8a5202d0f3e5dad7c1da4ae6c7ec3c7a02de8d320: Status 404 returned error can't find the container with id c51f49647453324943ed39a8a5202d0f3e5dad7c1da4ae6c7ec3c7a02de8d320 Apr 22 18:37:37.792011 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.791907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:37.794570 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.794548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cee7930e-83ce-4067-9398-494f9751737a-node-exporter-tls\") pod \"node-exporter-4pgk8\" (UID: \"cee7930e-83ce-4067-9398-494f9751737a\") " pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:37.892818 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.892775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.895947 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.895894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c70d334e-87f8-4f5e-a03b-6bd1312b7570-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c70d334e-87f8-4f5e-a03b-6bd1312b7570\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:37.934415 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:37.934379 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/node-exporter-4pgk8" Apr 22 18:37:38.073906 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.073834 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6"] Apr 22 18:37:38.083291 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.083268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.088245 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 18:37:38.088714 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088457 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 18:37:38.088714 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088466 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-zsgpb\"" Apr 22 18:37:38.088714 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088568 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 18:37:38.088714 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088625 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 18:37:38.088714 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088691 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 18:37:38.089197 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088775 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6"] Apr 22 18:37:38.089197 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.088790 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8u6m5cgdmdrdi\"" Apr 22 18:37:38.093941 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.093903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-tls\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.094039 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.093958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqwg\" (UniqueName: \"kubernetes.io/projected/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-kube-api-access-rvqwg\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.094039 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.094020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.094142 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.094063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-grpc-tls\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.094142 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.094096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.094234 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.094184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.094234 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.094227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.094338 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.094288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-metrics-client-ca\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.143472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.143428 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:37:38.159478 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.159447 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nlsz6" Apr 22 18:37:38.187315 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.187280 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db48549fc-qc97q" event={"ID":"955cdd70-155a-4867-a898-966f84075a1a","Type":"ContainerStarted","Data":"349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d"} Apr 22 18:37:38.187480 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.187323 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db48549fc-qc97q" event={"ID":"955cdd70-155a-4867-a898-966f84075a1a","Type":"ContainerStarted","Data":"c51f49647453324943ed39a8a5202d0f3e5dad7c1da4ae6c7ec3c7a02de8d320"} Apr 22 18:37:38.194854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.194816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-tls\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.195051 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.194863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqwg\" (UniqueName: 
\"kubernetes.io/projected/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-kube-api-access-rvqwg\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.195051 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.194896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.195051 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.194949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-grpc-tls\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.195051 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.195002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.195604 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.195459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: 
\"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.195604 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.195542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.200628 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.195674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-metrics-client-ca\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.200628 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.196704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-metrics-client-ca\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.201182 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.201155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.201300 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.201249 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-tls\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.201300 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.201247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.201420 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.201350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.201535 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.201515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.204028 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.204000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-secret-grpc-tls\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.204771 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.204748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqwg\" (UniqueName: \"kubernetes.io/projected/3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909-kube-api-access-rvqwg\") pod \"thanos-querier-7ffccd8cb5-bkzp6\" (UID: \"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909\") " pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.394160 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.394078 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:38.511814 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:38.511762 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee7930e_83ce_4067_9398_494f9751737a.slice/crio-b896349061c0ec6a52ecc6065e811a53bc4595cb1cf3889b74187c7d9dfad23b WatchSource:0}: Error finding container b896349061c0ec6a52ecc6065e811a53bc4595cb1cf3889b74187c7d9dfad23b: Status 404 returned error can't find the container with id b896349061c0ec6a52ecc6065e811a53bc4595cb1cf3889b74187c7d9dfad23b Apr 22 18:37:38.696553 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.696199 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6db48549fc-qc97q" podStartSLOduration=2.696176962 podStartE2EDuration="2.696176962s" podCreationTimestamp="2026-04-22 18:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:38.217184941 +0000 UTC m=+53.924407370" watchObservedRunningTime="2026-04-22 18:37:38.696176962 
+0000 UTC m=+54.403399391" Apr 22 18:37:38.696553 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.696517 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:37:38.697709 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:38.697664 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc70d334e_87f8_4f5e_a03b_6bd1312b7570.slice/crio-3e97d8cd785e0652f034210528fb0a326619f7aa7b2ca0b6a63de7613719ddba WatchSource:0}: Error finding container 3e97d8cd785e0652f034210528fb0a326619f7aa7b2ca0b6a63de7613719ddba: Status 404 returned error can't find the container with id 3e97d8cd785e0652f034210528fb0a326619f7aa7b2ca0b6a63de7613719ddba Apr 22 18:37:38.728619 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.728127 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:38.728619 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.728171 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:38.736339 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.736311 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:38.745423 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.745371 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:37:38.753578 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:38.753516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6"] Apr 22 18:37:38.757298 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:38.757223 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3578fdd1_e2ae_4aa0_9d35_5dd1f6de3909.slice/crio-6be03352970d67012185fb971fc99aa948a63e9bf23e94241d76f2e4fddce2f7 WatchSource:0}: Error finding container 6be03352970d67012185fb971fc99aa948a63e9bf23e94241d76f2e4fddce2f7: Status 404 returned error can't find the container with id 6be03352970d67012185fb971fc99aa948a63e9bf23e94241d76f2e4fddce2f7 Apr 22 18:37:39.192816 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.192752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4pgk8" event={"ID":"cee7930e-83ce-4067-9398-494f9751737a","Type":"ContainerStarted","Data":"b896349061c0ec6a52ecc6065e811a53bc4595cb1cf3889b74187c7d9dfad23b"} Apr 22 18:37:39.197342 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.196887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" event={"ID":"e60c41f2-43ea-4538-b1cf-948a3af04d39","Type":"ContainerStarted","Data":"7a00eb925a055449654ff13662bdad79cb7a2a7c197760a022ecdbd9e979f6b7"} Apr 22 18:37:39.197342 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.196948 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" event={"ID":"e60c41f2-43ea-4538-b1cf-948a3af04d39","Type":"ContainerStarted","Data":"36ad2661a2a73bee13533ac15bde0b46a9575b5ed649c6452e0537fb52ed31d7"} Apr 22 18:37:39.197342 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.196968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" event={"ID":"e60c41f2-43ea-4538-b1cf-948a3af04d39","Type":"ContainerStarted","Data":"c0303d72b08a987b109d419da7e62a232a19d74427dad50534062a82e1e9ece6"} Apr 22 18:37:39.198799 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.198770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" 
event={"ID":"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909","Type":"ContainerStarted","Data":"6be03352970d67012185fb971fc99aa948a63e9bf23e94241d76f2e4fddce2f7"} Apr 22 18:37:39.200580 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.200548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"3e97d8cd785e0652f034210528fb0a326619f7aa7b2ca0b6a63de7613719ddba"} Apr 22 18:37:39.202958 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.202900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" event={"ID":"bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e","Type":"ContainerStarted","Data":"238d51975b888290125bebb0408e7440c5c46745cca11c1f1a0f33fa1be296a6"} Apr 22 18:37:39.221326 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.221252 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-zzrnk" podStartSLOduration=2.164274633 podStartE2EDuration="4.221235139s" podCreationTimestamp="2026-04-22 18:37:35 +0000 UTC" firstStartedPulling="2026-04-22 18:37:36.452784891 +0000 UTC m=+52.160007296" lastFinishedPulling="2026-04-22 18:37:38.509745379 +0000 UTC m=+54.216967802" observedRunningTime="2026-04-22 18:37:39.221204735 +0000 UTC m=+54.928427165" watchObservedRunningTime="2026-04-22 18:37:39.221235139 +0000 UTC m=+54.928457569" Apr 22 18:37:39.244537 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:39.244474 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hzthd" podStartSLOduration=1.589413223 podStartE2EDuration="3.244455134s" podCreationTimestamp="2026-04-22 18:37:36 +0000 UTC" firstStartedPulling="2026-04-22 18:37:36.85810253 +0000 UTC m=+52.565324936" lastFinishedPulling="2026-04-22 18:37:38.513144426 +0000 UTC m=+54.220366847" 
observedRunningTime="2026-04-22 18:37:39.243623862 +0000 UTC m=+54.950846291" watchObservedRunningTime="2026-04-22 18:37:39.244455134 +0000 UTC m=+54.951677563" Apr 22 18:37:40.466041 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.465989 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7df4869b5b-rkw66"] Apr 22 18:37:40.489185 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.489146 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7df4869b5b-rkw66"] Apr 22 18:37:40.489344 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.489296 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.494395 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.494049 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-b8gh9\"" Apr 22 18:37:40.494395 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.494080 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:37:40.494395 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.494109 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:37:40.494395 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.494145 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:37:40.494395 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.494049 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4j63nnpo3cv5u\"" Apr 22 18:37:40.494395 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.494309 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:37:40.518121 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.518083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-metrics-server-audit-profiles\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.518305 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.518147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-secret-metrics-server-tls\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.518305 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.518180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-secret-metrics-server-client-certs\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.518305 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.518203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-audit-log\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.518305 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:37:40.518235 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhrrz\" (UniqueName: \"kubernetes.io/projected/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-kube-api-access-xhrrz\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.518499 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.518341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.518499 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.518405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-client-ca-bundle\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.618894 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.618840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-secret-metrics-server-tls\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.618894 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.618888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-secret-metrics-server-client-certs\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.619174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.618912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-audit-log\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.619174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.618959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhrrz\" (UniqueName: \"kubernetes.io/projected/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-kube-api-access-xhrrz\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.619174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.619013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.619174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.619058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-client-ca-bundle\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 
22 18:37:40.619174 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.619121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-metrics-server-audit-profiles\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.620557 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.620524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-metrics-server-audit-profiles\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.621183 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.621101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.622022 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.621992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-secret-metrics-server-tls\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.622199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.622175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-secret-metrics-server-client-certs\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.622356 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.622336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-client-ca-bundle\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.625472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.625452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-audit-log\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.630194 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.630168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhrrz\" (UniqueName: \"kubernetes.io/projected/e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd-kube-api-access-xhrrz\") pod \"metrics-server-7df4869b5b-rkw66\" (UID: \"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd\") " pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.799904 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.799824 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:37:40.828000 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.827964 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7"] Apr 22 18:37:40.842730 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.842664 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7"] Apr 22 18:37:40.843111 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.843085 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 18:37:40.846025 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.845995 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:37:40.846025 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.846017 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-dnxlx\"" Apr 22 18:37:40.922597 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:40.922555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f55f170e-96d3-496c-8e6a-ed27167d9344-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cgrz7\" (UID: \"f55f170e-96d3-496c-8e6a-ed27167d9344\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 18:37:41.023653 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.023607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f55f170e-96d3-496c-8e6a-ed27167d9344-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cgrz7\" (UID: \"f55f170e-96d3-496c-8e6a-ed27167d9344\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 18:37:41.023837 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:41.023759 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 18:37:41.023837 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:37:41.023824 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f55f170e-96d3-496c-8e6a-ed27167d9344-monitoring-plugin-cert podName:f55f170e-96d3-496c-8e6a-ed27167d9344 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:41.523808813 +0000 UTC m=+57.231031218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/f55f170e-96d3-496c-8e6a-ed27167d9344-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-cgrz7" (UID: "f55f170e-96d3-496c-8e6a-ed27167d9344") : secret "monitoring-plugin-cert" not found Apr 22 18:37:41.189742 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.189709 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7df4869b5b-rkw66"] Apr 22 18:37:41.192541 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:41.192504 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6dcfb9f_eef0_420b_a3c2_00d4b1f219dd.slice/crio-d1f6bdfda5aefb5357552d045ed6655d0e41a1679df002a14dcfd92e2ecb1dca WatchSource:0}: Error finding container d1f6bdfda5aefb5357552d045ed6655d0e41a1679df002a14dcfd92e2ecb1dca: Status 404 returned error can't find the container with id d1f6bdfda5aefb5357552d045ed6655d0e41a1679df002a14dcfd92e2ecb1dca Apr 22 18:37:41.211941 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.211893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"eb473c4f6b1822bd8307155f8b64bc1495d79a1129e4cd4158f53ca0f876d479"} Apr 22 18:37:41.213307 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.213280 2575 generic.go:358] "Generic (PLEG): container finished" podID="cee7930e-83ce-4067-9398-494f9751737a" containerID="9e7118017a8376eca86b282d900c57085460b22806298e030ceb7bf8562a383b" exitCode=0 Apr 22 18:37:41.213437 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.213363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4pgk8" event={"ID":"cee7930e-83ce-4067-9398-494f9751737a","Type":"ContainerDied","Data":"9e7118017a8376eca86b282d900c57085460b22806298e030ceb7bf8562a383b"} Apr 22 18:37:41.214349 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.214330 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" event={"ID":"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd","Type":"ContainerStarted","Data":"d1f6bdfda5aefb5357552d045ed6655d0e41a1679df002a14dcfd92e2ecb1dca"} Apr 22 18:37:41.528658 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.528618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f55f170e-96d3-496c-8e6a-ed27167d9344-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cgrz7\" (UID: \"f55f170e-96d3-496c-8e6a-ed27167d9344\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 18:37:41.531144 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.531121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f55f170e-96d3-496c-8e6a-ed27167d9344-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cgrz7\" (UID: \"f55f170e-96d3-496c-8e6a-ed27167d9344\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 
18:37:41.757085 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.757050 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 18:37:41.891405 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:41.891373 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7"] Apr 22 18:37:41.894965 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:41.894915 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55f170e_96d3_496c_8e6a_ed27167d9344.slice/crio-a38adfbbcd3fd1d7f10d81a3db5f259fd5c512d4c5cabf6be5debbeea08e3e4a WatchSource:0}: Error finding container a38adfbbcd3fd1d7f10d81a3db5f259fd5c512d4c5cabf6be5debbeea08e3e4a: Status 404 returned error can't find the container with id a38adfbbcd3fd1d7f10d81a3db5f259fd5c512d4c5cabf6be5debbeea08e3e4a Apr 22 18:37:42.219400 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.219364 2575 generic.go:358] "Generic (PLEG): container finished" podID="c70d334e-87f8-4f5e-a03b-6bd1312b7570" containerID="eb473c4f6b1822bd8307155f8b64bc1495d79a1129e4cd4158f53ca0f876d479" exitCode=0 Apr 22 18:37:42.219601 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.219461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerDied","Data":"eb473c4f6b1822bd8307155f8b64bc1495d79a1129e4cd4158f53ca0f876d479"} Apr 22 18:37:42.223163 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.223125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4pgk8" event={"ID":"cee7930e-83ce-4067-9398-494f9751737a","Type":"ContainerStarted","Data":"7bba9b6d51c75a296b42ea2340f2ce1f743276e202cbd07dd469ad9ca99bc5b1"} Apr 22 18:37:42.223268 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.223164 
2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4pgk8" event={"ID":"cee7930e-83ce-4067-9398-494f9751737a","Type":"ContainerStarted","Data":"728c028d64cba30019f5badfb50b054f2f2014b21a816ea813118066446189c2"} Apr 22 18:37:42.224538 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.224498 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" event={"ID":"f55f170e-96d3-496c-8e6a-ed27167d9344","Type":"ContainerStarted","Data":"a38adfbbcd3fd1d7f10d81a3db5f259fd5c512d4c5cabf6be5debbeea08e3e4a"} Apr 22 18:37:42.226664 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.226599 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" event={"ID":"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909","Type":"ContainerStarted","Data":"671795a5d396dfb86134eee7b4f93910e514a1e620076de1414b8f79562c13fe"} Apr 22 18:37:42.226664 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.226629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" event={"ID":"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909","Type":"ContainerStarted","Data":"d38a62c37980089685a294757f4e06bc63e84dd4ae04fb22da076a17b9c9362e"} Apr 22 18:37:42.226664 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.226643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" event={"ID":"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909","Type":"ContainerStarted","Data":"72f6d4e5d0dd2f5236c5265cd117c843374540778ab8ff9df8a98b8af208c811"} Apr 22 18:37:42.280381 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:42.280323 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4pgk8" podStartSLOduration=4.656980278 podStartE2EDuration="6.280304154s" podCreationTimestamp="2026-04-22 18:37:36 +0000 UTC" 
firstStartedPulling="2026-04-22 18:37:38.513830326 +0000 UTC m=+54.221052732" lastFinishedPulling="2026-04-22 18:37:40.137154198 +0000 UTC m=+55.844376608" observedRunningTime="2026-04-22 18:37:42.27844128 +0000 UTC m=+57.985663707" watchObservedRunningTime="2026-04-22 18:37:42.280304154 +0000 UTC m=+57.987526584" Apr 22 18:37:43.421236 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.421199 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db48549fc-qc97q"] Apr 22 18:37:43.457351 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.457307 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c5fbcf77b-x29r4"] Apr 22 18:37:43.482269 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.482235 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5fbcf77b-x29r4"] Apr 22 18:37:43.482445 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.482394 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.550663 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.550627 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-oauth-serving-cert\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.550854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.550677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-service-ca\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.550854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.550799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-oauth-config\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.550854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.550836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-config\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.550854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.550854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-trusted-ca-bundle\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.551110 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.550894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rrw\" (UniqueName: \"kubernetes.io/projected/23b12de7-c42d-44a6-bf65-2934fd21ff40-kube-api-access-x6rrw\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.551110 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.551048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-serving-cert\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.652537 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.652499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-serving-cert\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.652697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.652556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-oauth-serving-cert\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.652697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.652579 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-service-ca\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.652697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.652625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-oauth-config\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.652697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.652641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-config\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.652697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.652657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-trusted-ca-bundle\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.652697 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.652685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rrw\" (UniqueName: \"kubernetes.io/projected/23b12de7-c42d-44a6-bf65-2934fd21ff40-kube-api-access-x6rrw\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.654197 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.654158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-oauth-serving-cert\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.654975 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.654951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-service-ca\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.655365 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.655343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-config\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.656032 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.656011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-trusted-ca-bundle\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.656225 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.656204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-oauth-config\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 
18:37:43.660596 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.660573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-serving-cert\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.663499 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.663435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rrw\" (UniqueName: \"kubernetes.io/projected/23b12de7-c42d-44a6-bf65-2934fd21ff40-kube-api-access-x6rrw\") pod \"console-c5fbcf77b-x29r4\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") " pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:43.792486 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:43.792395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:44.110424 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:44.110328 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zbqgd" Apr 22 18:37:45.110850 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.110815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5fbcf77b-x29r4"] Apr 22 18:37:45.114494 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:45.114460 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b12de7_c42d_44a6_bf65_2934fd21ff40.slice/crio-17aca6c89e9b2b70a9fbe43e96a06357ff28e3b9f1f50a37dbe9e02beae951f1 WatchSource:0}: Error finding container 17aca6c89e9b2b70a9fbe43e96a06357ff28e3b9f1f50a37dbe9e02beae951f1: Status 404 returned error can't find the container with id 17aca6c89e9b2b70a9fbe43e96a06357ff28e3b9f1f50a37dbe9e02beae951f1 Apr 22 18:37:45.241768 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.241727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" event={"ID":"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909","Type":"ContainerStarted","Data":"9b31648736e71ac8d38ea1d9376f0a9e573a561e840886d6bc338b6af4270908"} Apr 22 18:37:45.241768 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.241776 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" event={"ID":"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909","Type":"ContainerStarted","Data":"c686ac6c9d624b5b4749d30eb9f7fac473696275bdb5096621fbd82b2efd9515"} Apr 22 18:37:45.245981 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.245950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"ea559e6231658cccaffbfb4ff144ea2e47c63da27d755d7201b36a2e5dd7e5bc"} Apr 22 18:37:45.247512 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.247488 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" event={"ID":"e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd","Type":"ContainerStarted","Data":"8f3a1fba747af61a821e47610d63ac0bcba4ebc60876563c5253f48ed27ad926"} Apr 22 18:37:45.250868 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.249883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" event={"ID":"f55f170e-96d3-496c-8e6a-ed27167d9344","Type":"ContainerStarted","Data":"3250d0f5060b16e8ecda011906de558f3aa25a50438d05a4daf41533a1f5b1a9"} Apr 22 18:37:45.250868 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.250814 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 18:37:45.254020 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:37:45.253659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5fbcf77b-x29r4" event={"ID":"23b12de7-c42d-44a6-bf65-2934fd21ff40","Type":"ContainerStarted","Data":"70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e"} Apr 22 18:37:45.254020 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.253697 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5fbcf77b-x29r4" event={"ID":"23b12de7-c42d-44a6-bf65-2934fd21ff40","Type":"ContainerStarted","Data":"17aca6c89e9b2b70a9fbe43e96a06357ff28e3b9f1f50a37dbe9e02beae951f1"} Apr 22 18:37:45.258005 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.257876 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" Apr 22 18:37:45.282583 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.282526 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" podStartSLOduration=1.559874095 podStartE2EDuration="5.282507328s" podCreationTimestamp="2026-04-22 18:37:40 +0000 UTC" firstStartedPulling="2026-04-22 18:37:41.194525338 +0000 UTC m=+56.901747748" lastFinishedPulling="2026-04-22 18:37:44.917158563 +0000 UTC m=+60.624380981" observedRunningTime="2026-04-22 18:37:45.282216206 +0000 UTC m=+60.989438633" watchObservedRunningTime="2026-04-22 18:37:45.282507328 +0000 UTC m=+60.989729757" Apr 22 18:37:45.314501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.314447 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cgrz7" podStartSLOduration=2.254817835 podStartE2EDuration="5.314427516s" podCreationTimestamp="2026-04-22 18:37:40 +0000 UTC" firstStartedPulling="2026-04-22 18:37:41.897900245 +0000 UTC m=+57.605122656" lastFinishedPulling="2026-04-22 18:37:44.95750993 +0000 UTC m=+60.664732337" 
observedRunningTime="2026-04-22 18:37:45.313392935 +0000 UTC m=+61.020615362" watchObservedRunningTime="2026-04-22 18:37:45.314427516 +0000 UTC m=+61.021649946" Apr 22 18:37:45.363082 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:45.362871 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c5fbcf77b-x29r4" podStartSLOduration=2.362855131 podStartE2EDuration="2.362855131s" podCreationTimestamp="2026-04-22 18:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:45.362487047 +0000 UTC m=+61.069709472" watchObservedRunningTime="2026-04-22 18:37:45.362855131 +0000 UTC m=+61.070077560" Apr 22 18:37:46.260223 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.260183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" event={"ID":"3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909","Type":"ContainerStarted","Data":"d683a7a093f74107730a7f14427d05af5bb31edc8e2710d5c358b6d8b694c34c"} Apr 22 18:37:46.260641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.260488 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:46.263457 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.263411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"423726b210adb72612c93be00e389205e796c5642410bd4da7bbdeb760d54ba5"} Apr 22 18:37:46.263567 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.263465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"1b9477702b20935206864707b7d183426eb27c75729aab820fd18ea1357b5f74"} Apr 22 
18:37:46.263567 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.263479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"063dd03a4920d53a3987a0c6da632e44ab0283572c38871b531bff5eeacbefb9"} Apr 22 18:37:46.263567 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.263491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"0e27aa4d26f8b1ce536b81df288b834df9d967e95cc37e4720ee19ad3a3699c2"} Apr 22 18:37:46.263567 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.263504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c70d334e-87f8-4f5e-a03b-6bd1312b7570","Type":"ContainerStarted","Data":"b7911783b66538b911181a6c1557f4067d7c587c7b0b38e6523f303bc3f8fa60"} Apr 22 18:37:46.266443 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.266424 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" Apr 22 18:37:46.289042 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.288988 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7ffccd8cb5-bkzp6" podStartSLOduration=2.131860433 podStartE2EDuration="8.288974309s" podCreationTimestamp="2026-04-22 18:37:38 +0000 UTC" firstStartedPulling="2026-04-22 18:37:38.760057349 +0000 UTC m=+54.467279754" lastFinishedPulling="2026-04-22 18:37:44.91717122 +0000 UTC m=+60.624393630" observedRunningTime="2026-04-22 18:37:46.28749016 +0000 UTC m=+61.994712587" watchObservedRunningTime="2026-04-22 18:37:46.288974309 +0000 UTC m=+61.996196731" Apr 22 18:37:46.318845 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:46.318792 2575 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.059133702 podStartE2EDuration="9.318776392s" podCreationTimestamp="2026-04-22 18:37:37 +0000 UTC" firstStartedPulling="2026-04-22 18:37:38.701241744 +0000 UTC m=+54.408464151" lastFinishedPulling="2026-04-22 18:37:44.960884428 +0000 UTC m=+60.668106841" observedRunningTime="2026-04-22 18:37:46.317242183 +0000 UTC m=+62.024464619" watchObservedRunningTime="2026-04-22 18:37:46.318776392 +0000 UTC m=+62.025998819" Apr 22 18:37:47.344684 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:47.344642 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6db48549fc-qc97q" Apr 22 18:37:50.619009 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.618961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:50.619564 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.619030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:50.622124 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.622102 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:37:50.623185 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.623166 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 
18:37:50.631845 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.631820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/261a5574-d8e0-4ff1-99d3-ac73ec229f92-metrics-certs\") pod \"network-metrics-daemon-sx6lq\" (UID: \"261a5574-d8e0-4ff1-99d3-ac73ec229f92\") " pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:50.632498 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.632480 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:37:50.642722 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.642690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ms7\" (UniqueName: \"kubernetes.io/projected/5664213f-7ddf-424a-af58-77eee7d05998-kube-api-access-p6ms7\") pod \"network-check-target-xg9qw\" (UID: \"5664213f-7ddf-424a-af58-77eee7d05998\") " pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:50.743827 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.743791 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zwsxn\"" Apr 22 18:37:50.747366 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.747340 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-95dwn\"" Apr 22 18:37:50.751657 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.751637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:50.755399 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.755374 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx6lq" Apr 22 18:37:50.900905 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.900876 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sx6lq"] Apr 22 18:37:50.904358 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:50.904329 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod261a5574_d8e0_4ff1_99d3_ac73ec229f92.slice/crio-324277e5deb878b8aafd883108e65b235d21b63bd31016ca9ff804f692787702 WatchSource:0}: Error finding container 324277e5deb878b8aafd883108e65b235d21b63bd31016ca9ff804f692787702: Status 404 returned error can't find the container with id 324277e5deb878b8aafd883108e65b235d21b63bd31016ca9ff804f692787702 Apr 22 18:37:50.922239 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:50.922210 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xg9qw"] Apr 22 18:37:50.924503 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:37:50.924478 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5664213f_7ddf_424a_af58_77eee7d05998.slice/crio-2536d2764a8c965df402b726207a98d3deefb2c0c494ce407423800c3bbf396e WatchSource:0}: Error finding container 2536d2764a8c965df402b726207a98d3deefb2c0c494ce407423800c3bbf396e: Status 404 returned error can't find the container with id 2536d2764a8c965df402b726207a98d3deefb2c0c494ce407423800c3bbf396e Apr 22 18:37:51.283111 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:51.283072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sx6lq" event={"ID":"261a5574-d8e0-4ff1-99d3-ac73ec229f92","Type":"ContainerStarted","Data":"324277e5deb878b8aafd883108e65b235d21b63bd31016ca9ff804f692787702"} Apr 22 18:37:51.283996 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:37:51.283974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xg9qw" event={"ID":"5664213f-7ddf-424a-af58-77eee7d05998","Type":"ContainerStarted","Data":"2536d2764a8c965df402b726207a98d3deefb2c0c494ce407423800c3bbf396e"} Apr 22 18:37:52.289272 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:52.289236 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sx6lq" event={"ID":"261a5574-d8e0-4ff1-99d3-ac73ec229f92","Type":"ContainerStarted","Data":"5a634819f61d64257f1863f38e10451afb2fb3a11af2d3a2d6507fe698efca6b"} Apr 22 18:37:52.289272 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:52.289278 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sx6lq" event={"ID":"261a5574-d8e0-4ff1-99d3-ac73ec229f92","Type":"ContainerStarted","Data":"801236e7b9a272ec8f52f883907d2fdf3b8bf25ac4a227cd57d453ceff24dd7c"} Apr 22 18:37:52.311049 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:52.310985 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sx6lq" podStartSLOduration=66.23468573 podStartE2EDuration="1m7.310964199s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:50.906420479 +0000 UTC m=+66.613642888" lastFinishedPulling="2026-04-22 18:37:51.982698945 +0000 UTC m=+67.689921357" observedRunningTime="2026-04-22 18:37:52.307537235 +0000 UTC m=+68.014759880" watchObservedRunningTime="2026-04-22 18:37:52.310964199 +0000 UTC m=+68.018186631" Apr 22 18:37:53.793435 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:53.793395 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:53.793892 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:53.793454 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:53.798235 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:53.798212 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:54.297995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:54.297954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xg9qw" event={"ID":"5664213f-7ddf-424a-af58-77eee7d05998","Type":"ContainerStarted","Data":"2629e8da3aec92ae398201f5715f200d531feb4e8673af90b74a2ae18ba6abee"} Apr 22 18:37:54.298387 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:54.298367 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:37:54.301867 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:54.301847 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c5fbcf77b-x29r4" Apr 22 18:37:54.322605 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:54.322555 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xg9qw" podStartSLOduration=66.535646929 podStartE2EDuration="1m9.322540927s" podCreationTimestamp="2026-04-22 18:36:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:50.926378815 +0000 UTC m=+66.633601220" lastFinishedPulling="2026-04-22 18:37:53.713272812 +0000 UTC m=+69.420495218" observedRunningTime="2026-04-22 18:37:54.321469442 +0000 UTC m=+70.028691888" watchObservedRunningTime="2026-04-22 18:37:54.322540927 +0000 UTC m=+70.029763355" Apr 22 18:37:54.384501 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:37:54.384463 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7797494-lgnnv"] Apr 22 18:38:00.800520 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:00.800482 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:38:00.800520 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:00.800527 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:38:08.455564 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.455493 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6db48549fc-qc97q" podUID="955cdd70-155a-4867-a898-966f84075a1a" containerName="console" containerID="cri-o://349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d" gracePeriod=15 Apr 22 18:38:08.701740 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.701710 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db48549fc-qc97q_955cdd70-155a-4867-a898-966f84075a1a/console/0.log" Apr 22 18:38:08.701878 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.701787 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6db48549fc-qc97q" Apr 22 18:38:08.780871 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.780777 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-oauth-config\") pod \"955cdd70-155a-4867-a898-966f84075a1a\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " Apr 22 18:38:08.780871 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.780851 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-oauth-serving-cert\") pod \"955cdd70-155a-4867-a898-966f84075a1a\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " Apr 22 18:38:08.781119 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.780877 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb2jn\" (UniqueName: \"kubernetes.io/projected/955cdd70-155a-4867-a898-966f84075a1a-kube-api-access-tb2jn\") pod \"955cdd70-155a-4867-a898-966f84075a1a\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " Apr 22 18:38:08.781119 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.780911 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-console-config\") pod \"955cdd70-155a-4867-a898-966f84075a1a\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " Apr 22 18:38:08.781119 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.780963 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-trusted-ca-bundle\") pod \"955cdd70-155a-4867-a898-966f84075a1a\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " Apr 22 18:38:08.781119 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781096 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-serving-cert\") pod \"955cdd70-155a-4867-a898-966f84075a1a\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " Apr 22 18:38:08.781314 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781198 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-service-ca\") pod \"955cdd70-155a-4867-a898-966f84075a1a\" (UID: \"955cdd70-155a-4867-a898-966f84075a1a\") " Apr 22 18:38:08.781408 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781385 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "955cdd70-155a-4867-a898-966f84075a1a" (UID: "955cdd70-155a-4867-a898-966f84075a1a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:08.781462 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781401 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-console-config" (OuterVolumeSpecName: "console-config") pod "955cdd70-155a-4867-a898-966f84075a1a" (UID: "955cdd70-155a-4867-a898-966f84075a1a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:08.781462 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781404 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "955cdd70-155a-4867-a898-966f84075a1a" (UID: "955cdd70-155a-4867-a898-966f84075a1a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:08.781566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781502 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-oauth-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.781687 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781633 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-console-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.781687 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781655 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-trusted-ca-bundle\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.781687 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.781699 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-service-ca" (OuterVolumeSpecName: "service-ca") pod "955cdd70-155a-4867-a898-966f84075a1a" (UID: "955cdd70-155a-4867-a898-966f84075a1a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:08.783427 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.783386 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "955cdd70-155a-4867-a898-966f84075a1a" (UID: "955cdd70-155a-4867-a898-966f84075a1a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:38:08.783427 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.783417 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "955cdd70-155a-4867-a898-966f84075a1a" (UID: "955cdd70-155a-4867-a898-966f84075a1a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:38:08.783427 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.783393 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955cdd70-155a-4867-a898-966f84075a1a-kube-api-access-tb2jn" (OuterVolumeSpecName: "kube-api-access-tb2jn") pod "955cdd70-155a-4867-a898-966f84075a1a" (UID: "955cdd70-155a-4867-a898-966f84075a1a"). InnerVolumeSpecName "kube-api-access-tb2jn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:38:08.882260 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.882207 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/955cdd70-155a-4867-a898-966f84075a1a-service-ca\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.882260 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.882252 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-oauth-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.882260 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.882263 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tb2jn\" (UniqueName: \"kubernetes.io/projected/955cdd70-155a-4867-a898-966f84075a1a-kube-api-access-tb2jn\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.882260 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:08.882273 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/955cdd70-155a-4867-a898-966f84075a1a-console-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:09.348279 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.348244 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6db48549fc-qc97q_955cdd70-155a-4867-a898-966f84075a1a/console/0.log" Apr 22 18:38:09.348472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.348289 2575 generic.go:358] "Generic (PLEG): container finished" podID="955cdd70-155a-4867-a898-966f84075a1a" containerID="349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d" exitCode=2 Apr 22 18:38:09.348472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.348357 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6db48549fc-qc97q" Apr 22 18:38:09.348472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.348373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db48549fc-qc97q" event={"ID":"955cdd70-155a-4867-a898-966f84075a1a","Type":"ContainerDied","Data":"349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d"} Apr 22 18:38:09.348472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.348407 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6db48549fc-qc97q" event={"ID":"955cdd70-155a-4867-a898-966f84075a1a","Type":"ContainerDied","Data":"c51f49647453324943ed39a8a5202d0f3e5dad7c1da4ae6c7ec3c7a02de8d320"} Apr 22 18:38:09.348472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.348426 2575 scope.go:117] "RemoveContainer" containerID="349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d" Apr 22 18:38:09.356560 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.356540 2575 scope.go:117] "RemoveContainer" containerID="349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d" Apr 22 18:38:09.356847 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:38:09.356820 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d\": container with ID starting with 349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d not found: ID does not exist" containerID="349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d" Apr 22 18:38:09.356964 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.356853 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d"} err="failed to get container status \"349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d\": rpc error: code = 
NotFound desc = could not find container \"349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d\": container with ID starting with 349588bce8a426dc5382a46360cdfb1dcb2677bdc59eb04912eecb17544e838d not found: ID does not exist" Apr 22 18:38:09.373188 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.373147 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6db48549fc-qc97q"] Apr 22 18:38:09.375649 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:09.375619 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6db48549fc-qc97q"] Apr 22 18:38:10.933655 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:10.933614 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="955cdd70-155a-4867-a898-966f84075a1a" path="/var/lib/kubelet/pods/955cdd70-155a-4867-a898-966f84075a1a/volumes" Apr 22 18:38:19.404545 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.404484 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c7797494-lgnnv" podUID="a76ee650-7c49-4aeb-9e0e-d795265c262a" containerName="console" containerID="cri-o://ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8" gracePeriod=15 Apr 22 18:38:19.636719 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.636686 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7797494-lgnnv_a76ee650-7c49-4aeb-9e0e-d795265c262a/console/0.log" Apr 22 18:38:19.636852 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.636747 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:38:19.776083 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.775969 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nzz2\" (UniqueName: \"kubernetes.io/projected/a76ee650-7c49-4aeb-9e0e-d795265c262a-kube-api-access-7nzz2\") pod \"a76ee650-7c49-4aeb-9e0e-d795265c262a\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " Apr 22 18:38:19.776083 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.776025 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-config\") pod \"a76ee650-7c49-4aeb-9e0e-d795265c262a\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " Apr 22 18:38:19.776083 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.776054 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-serving-cert\") pod \"a76ee650-7c49-4aeb-9e0e-d795265c262a\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " Apr 22 18:38:19.776083 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.776095 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-service-ca\") pod \"a76ee650-7c49-4aeb-9e0e-d795265c262a\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " Apr 22 18:38:19.776418 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.776116 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-oauth-serving-cert\") pod \"a76ee650-7c49-4aeb-9e0e-d795265c262a\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " Apr 22 18:38:19.776418 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:38:19.776152 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-oauth-config\") pod \"a76ee650-7c49-4aeb-9e0e-d795265c262a\" (UID: \"a76ee650-7c49-4aeb-9e0e-d795265c262a\") " Apr 22 18:38:19.776566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.776542 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-config" (OuterVolumeSpecName: "console-config") pod "a76ee650-7c49-4aeb-9e0e-d795265c262a" (UID: "a76ee650-7c49-4aeb-9e0e-d795265c262a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:19.776621 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.776561 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-service-ca" (OuterVolumeSpecName: "service-ca") pod "a76ee650-7c49-4aeb-9e0e-d795265c262a" (UID: "a76ee650-7c49-4aeb-9e0e-d795265c262a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:19.776621 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.776570 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a76ee650-7c49-4aeb-9e0e-d795265c262a" (UID: "a76ee650-7c49-4aeb-9e0e-d795265c262a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:19.778446 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.778407 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a76ee650-7c49-4aeb-9e0e-d795265c262a" (UID: "a76ee650-7c49-4aeb-9e0e-d795265c262a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:38:19.778549 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.778458 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76ee650-7c49-4aeb-9e0e-d795265c262a-kube-api-access-7nzz2" (OuterVolumeSpecName: "kube-api-access-7nzz2") pod "a76ee650-7c49-4aeb-9e0e-d795265c262a" (UID: "a76ee650-7c49-4aeb-9e0e-d795265c262a"). InnerVolumeSpecName "kube-api-access-7nzz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:38:19.778549 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.778488 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a76ee650-7c49-4aeb-9e0e-d795265c262a" (UID: "a76ee650-7c49-4aeb-9e0e-d795265c262a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:38:19.876740 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.876701 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7nzz2\" (UniqueName: \"kubernetes.io/projected/a76ee650-7c49-4aeb-9e0e-d795265c262a-kube-api-access-7nzz2\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:19.876740 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.876732 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:19.876740 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.876743 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:19.876984 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.876752 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-service-ca\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:19.876984 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.876761 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a76ee650-7c49-4aeb-9e0e-d795265c262a-oauth-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:19.876984 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:19.876769 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a76ee650-7c49-4aeb-9e0e-d795265c262a-console-oauth-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:38:20.386784 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:38:20.386748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c7797494-lgnnv_a76ee650-7c49-4aeb-9e0e-d795265c262a/console/0.log" Apr 22 18:38:20.386981 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.386794 2575 generic.go:358] "Generic (PLEG): container finished" podID="a76ee650-7c49-4aeb-9e0e-d795265c262a" containerID="ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8" exitCode=2 Apr 22 18:38:20.386981 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.386864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7797494-lgnnv" event={"ID":"a76ee650-7c49-4aeb-9e0e-d795265c262a","Type":"ContainerDied","Data":"ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8"} Apr 22 18:38:20.386981 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.386897 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c7797494-lgnnv" Apr 22 18:38:20.386981 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.386912 2575 scope.go:117] "RemoveContainer" containerID="ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8" Apr 22 18:38:20.387148 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.386900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c7797494-lgnnv" event={"ID":"a76ee650-7c49-4aeb-9e0e-d795265c262a","Type":"ContainerDied","Data":"45ae5b95a5b627270acfee1d5e6aaa9504b22cce89a38c4c743afc5b8e5b7c5a"} Apr 22 18:38:20.395335 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.395310 2575 scope.go:117] "RemoveContainer" containerID="ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8" Apr 22 18:38:20.395618 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:38:20.395601 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8\": container with ID starting with ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8 not found: ID does not exist" containerID="ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8" Apr 22 18:38:20.395670 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.395628 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8"} err="failed to get container status \"ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8\": rpc error: code = NotFound desc = could not find container \"ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8\": container with ID starting with ef9413686857fd58ee35e9e92121f08adf26588f937f3d6c8ca74685832446c8 not found: ID does not exist" Apr 22 18:38:20.410483 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.410456 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c7797494-lgnnv"] Apr 22 18:38:20.415388 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.415364 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c7797494-lgnnv"] Apr 22 18:38:20.805854 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.805825 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:38:20.809867 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.809841 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7df4869b5b-rkw66" Apr 22 18:38:20.933336 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:38:20.933302 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76ee650-7c49-4aeb-9e0e-d795265c262a" path="/var/lib/kubelet/pods/a76ee650-7c49-4aeb-9e0e-d795265c262a/volumes" Apr 22 18:38:25.303874 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:38:25.303845 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xg9qw" Apr 22 18:39:00.600116 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.600073 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6"] Apr 22 18:39:00.600641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.600414 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955cdd70-155a-4867-a898-966f84075a1a" containerName="console" Apr 22 18:39:00.600641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.600427 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="955cdd70-155a-4867-a898-966f84075a1a" containerName="console" Apr 22 18:39:00.600641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.600438 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a76ee650-7c49-4aeb-9e0e-d795265c262a" containerName="console" Apr 22 18:39:00.600641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.600443 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76ee650-7c49-4aeb-9e0e-d795265c262a" containerName="console" Apr 22 18:39:00.600641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.600535 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a76ee650-7c49-4aeb-9e0e-d795265c262a" containerName="console" Apr 22 18:39:00.600641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.600545 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="955cdd70-155a-4867-a898-966f84075a1a" containerName="console" Apr 22 18:39:00.603654 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.603632 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.606716 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.606693 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:39:00.607329 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.607291 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:39:00.607329 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.607314 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:39:00.607515 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.607345 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:39:00.607515 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.607388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-55c2c\"" Apr 22 18:39:00.607652 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.607625 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:39:00.613262 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.612431 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:39:00.642828 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.642797 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6"] Apr 22 18:39:00.728478 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-metrics-client-ca\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.728669 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-telemeter-client-tls\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.728669 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728536 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-federate-client-tls\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.728669 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stzd8\" (UniqueName: \"kubernetes.io/projected/95404392-caeb-4280-9b51-cec352e540b0-kube-api-access-stzd8\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.728669 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728615 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.728840 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.728840 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728738 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-serving-certs-ca-bundle\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.728840 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.728759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-secret-telemeter-client\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.829968 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.829900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " 
pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830120 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.829990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830120 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.830016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-serving-certs-ca-bundle\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830120 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.830033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-secret-telemeter-client\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830120 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.830054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-metrics-client-ca\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830120 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.830087 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-telemeter-client-tls\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830390 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.830126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-federate-client-tls\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830390 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.830243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stzd8\" (UniqueName: \"kubernetes.io/projected/95404392-caeb-4280-9b51-cec352e540b0-kube-api-access-stzd8\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.830845 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.830813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-serving-certs-ca-bundle\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.831140 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.831121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-metrics-client-ca\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: 
\"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.831213 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.831197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95404392-caeb-4280-9b51-cec352e540b0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.832861 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.832830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-telemeter-client-tls\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.832976 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.832899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-secret-telemeter-client\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.833120 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.833103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.833209 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.833190 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/95404392-caeb-4280-9b51-cec352e540b0-federate-client-tls\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.840576 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.840556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stzd8\" (UniqueName: \"kubernetes.io/projected/95404392-caeb-4280-9b51-cec352e540b0-kube-api-access-stzd8\") pod \"telemeter-client-5d8cf9c78b-v9ss6\" (UID: \"95404392-caeb-4280-9b51-cec352e540b0\") " pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:00.916920 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:00.916882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" Apr 22 18:39:01.051355 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:01.051321 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6"] Apr 22 18:39:01.055028 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:39:01.054995 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95404392_caeb_4280_9b51_cec352e540b0.slice/crio-4184b212627e42fbb12306a6fdb168b17495be59de5b9df52d5e0ff91f60788d WatchSource:0}: Error finding container 4184b212627e42fbb12306a6fdb168b17495be59de5b9df52d5e0ff91f60788d: Status 404 returned error can't find the container with id 4184b212627e42fbb12306a6fdb168b17495be59de5b9df52d5e0ff91f60788d Apr 22 18:39:01.515658 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:01.515614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" 
event={"ID":"95404392-caeb-4280-9b51-cec352e540b0","Type":"ContainerStarted","Data":"4184b212627e42fbb12306a6fdb168b17495be59de5b9df52d5e0ff91f60788d"} Apr 22 18:39:03.523995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:03.523952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" event={"ID":"95404392-caeb-4280-9b51-cec352e540b0","Type":"ContainerStarted","Data":"fa8d4e0dd6b16b5cdc9e9f69488bba3050642678c67e9d0ae8143297c2e2c739"} Apr 22 18:39:03.523995 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:03.523996 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" event={"ID":"95404392-caeb-4280-9b51-cec352e540b0","Type":"ContainerStarted","Data":"a81befe56e77c9d47dab4a1be4948cbadb036dd227dd0c469b0ad4769ec4c8e7"} Apr 22 18:39:03.524417 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:03.524005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" event={"ID":"95404392-caeb-4280-9b51-cec352e540b0","Type":"ContainerStarted","Data":"9dea8153da459f3e865e640b5a7ea1b00493df00317557f6594ff4c447ef0502"} Apr 22 18:39:03.555949 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:03.555875 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d8cf9c78b-v9ss6" podStartSLOduration=1.60843209 podStartE2EDuration="3.555857972s" podCreationTimestamp="2026-04-22 18:39:00 +0000 UTC" firstStartedPulling="2026-04-22 18:39:01.056964098 +0000 UTC m=+136.764186504" lastFinishedPulling="2026-04-22 18:39:03.004389977 +0000 UTC m=+138.711612386" observedRunningTime="2026-04-22 18:39:03.554424446 +0000 UTC m=+139.261646885" watchObservedRunningTime="2026-04-22 18:39:03.555857972 +0000 UTC m=+139.263080400" Apr 22 18:39:04.215691 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.215654 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-6f8c6fbfb6-vr9rj"]
Apr 22 18:39:04.219005 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.218985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.231331 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.231302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8c6fbfb6-vr9rj"]
Apr 22 18:39:04.362347 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.362300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-config\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.362531 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.362353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-oauth-serving-cert\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.362531 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.362464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-serving-cert\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.362531 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.362490 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-trusted-ca-bundle\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.362531 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.362517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlbb8\" (UniqueName: \"kubernetes.io/projected/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-kube-api-access-qlbb8\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.362693 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.362552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-oauth-config\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.362693 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.362594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-service-ca\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.463641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.463604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-config\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.463641 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.463649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-oauth-serving-cert\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.463890 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.463694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-serving-cert\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.463890 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.463711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-trusted-ca-bundle\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.463890 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.463739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlbb8\" (UniqueName: \"kubernetes.io/projected/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-kube-api-access-qlbb8\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.463890 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.463768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-oauth-config\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.464113 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.463982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-service-ca\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.464417 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.464391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-config\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.464689 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.464448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-oauth-serving-cert\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.464689 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.464614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-trusted-ca-bundle\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.464689 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.464649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-service-ca\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.466406 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.466356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-serving-cert\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.466481 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.466409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-oauth-config\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.476690 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.476662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlbb8\" (UniqueName: \"kubernetes.io/projected/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-kube-api-access-qlbb8\") pod \"console-6f8c6fbfb6-vr9rj\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.528048 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.528019 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:04.688979 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:04.688901 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f8c6fbfb6-vr9rj"]
Apr 22 18:39:04.693193 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:39:04.693165 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29951f7_1c0b_47f1_9013_cc6997ecbc4b.slice/crio-cc9cbcfe9b8a016872e49c01bc04cb154dd783f7450b98bb91714f3fdb6d559e WatchSource:0}: Error finding container cc9cbcfe9b8a016872e49c01bc04cb154dd783f7450b98bb91714f3fdb6d559e: Status 404 returned error can't find the container with id cc9cbcfe9b8a016872e49c01bc04cb154dd783f7450b98bb91714f3fdb6d559e
Apr 22 18:39:05.530943 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:05.530886 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8c6fbfb6-vr9rj" event={"ID":"b29951f7-1c0b-47f1-9013-cc6997ecbc4b","Type":"ContainerStarted","Data":"ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d"}
Apr 22 18:39:05.530943 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:05.530944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8c6fbfb6-vr9rj" event={"ID":"b29951f7-1c0b-47f1-9013-cc6997ecbc4b","Type":"ContainerStarted","Data":"cc9cbcfe9b8a016872e49c01bc04cb154dd783f7450b98bb91714f3fdb6d559e"}
Apr 22 18:39:05.551397 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:05.551346 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f8c6fbfb6-vr9rj" podStartSLOduration=1.5513304319999999 podStartE2EDuration="1.551330432s" podCreationTimestamp="2026-04-22 18:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:05.551079474 +0000 UTC m=+141.258301903" watchObservedRunningTime="2026-04-22 18:39:05.551330432 +0000 UTC m=+141.258552860"
Apr 22 18:39:14.528543 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.528499 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:14.528543 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.528551 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:14.533369 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.533341 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:14.560579 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.560545 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:14.630446 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.630415 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5fbcf77b-x29r4"]
Apr 22 18:39:14.771468 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.771428 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bcbcc4984-647zw"]
Apr 22 18:39:14.777802 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.777769 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:14.787566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.787541 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bcbcc4984-647zw"]
Apr 22 18:39:14.955796 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.955756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-oauth-config\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:14.955796 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.955800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-oauth-serving-cert\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:14.956034 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.955823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-kube-api-access-h48lg\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:14.956034 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.955912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-serving-cert\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:14.956034 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.955979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-config\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:14.956034 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.956009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-service-ca\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:14.956163 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:14.956055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-trusted-ca-bundle\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.057522 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.057426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-trusted-ca-bundle\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.057522 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.057472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-oauth-config\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.057522 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.057514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-oauth-serving-cert\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.057821 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.057542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-kube-api-access-h48lg\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.057821 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.057601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-serving-cert\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.057821 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.057648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-config\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.057821 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.057690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-service-ca\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.058434 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.058399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-oauth-serving-cert\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.058562 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.058401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-trusted-ca-bundle\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.058562 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.058476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-service-ca\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.059022 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.058996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-config\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.060448 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.060429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-serving-cert\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.060713 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.060690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-oauth-config\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.069017 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.068986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-kube-api-access-h48lg\") pod \"console-bcbcc4984-647zw\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") " pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.088741 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.088708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:15.220210 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.220172 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bcbcc4984-647zw"]
Apr 22 18:39:15.223588 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:39:15.223560 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod829141e9_7509_4f9e_aca6_6fe11a0ed5b0.slice/crio-d9bd508d8d3e3d3c7dbe4640d07d5ad206971626e44031b89f0aa0732d1badca WatchSource:0}: Error finding container d9bd508d8d3e3d3c7dbe4640d07d5ad206971626e44031b89f0aa0732d1badca: Status 404 returned error can't find the container with id d9bd508d8d3e3d3c7dbe4640d07d5ad206971626e44031b89f0aa0732d1badca
Apr 22 18:39:15.560460 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.560418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcbcc4984-647zw" event={"ID":"829141e9-7509-4f9e-aca6-6fe11a0ed5b0","Type":"ContainerStarted","Data":"39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6"}
Apr 22 18:39:15.560460 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.560467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcbcc4984-647zw" event={"ID":"829141e9-7509-4f9e-aca6-6fe11a0ed5b0","Type":"ContainerStarted","Data":"d9bd508d8d3e3d3c7dbe4640d07d5ad206971626e44031b89f0aa0732d1badca"}
Apr 22 18:39:15.582804 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:15.582750 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bcbcc4984-647zw" podStartSLOduration=1.582734519 podStartE2EDuration="1.582734519s" podCreationTimestamp="2026-04-22 18:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:15.581738115 +0000 UTC m=+151.288960538" watchObservedRunningTime="2026-04-22 18:39:15.582734519 +0000 UTC m=+151.289956946"
Apr 22 18:39:25.088885 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:25.088780 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:25.088885 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:25.088832 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:25.093655 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:25.093619 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:25.592606 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:25.592578 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:39:25.646120 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:25.646079 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f8c6fbfb6-vr9rj"]
Apr 22 18:39:39.651400 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.651332 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c5fbcf77b-x29r4" podUID="23b12de7-c42d-44a6-bf65-2934fd21ff40" containerName="console" containerID="cri-o://70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e" gracePeriod=15
Apr 22 18:39:39.888498 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.888475 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5fbcf77b-x29r4_23b12de7-c42d-44a6-bf65-2934fd21ff40/console/0.log"
Apr 22 18:39:39.888645 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.888539 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5fbcf77b-x29r4"
Apr 22 18:39:39.970818 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.970726 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-oauth-config\") pod \"23b12de7-c42d-44a6-bf65-2934fd21ff40\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") "
Apr 22 18:39:39.970818 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.970769 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-trusted-ca-bundle\") pod \"23b12de7-c42d-44a6-bf65-2934fd21ff40\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") "
Apr 22 18:39:39.970818 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.970813 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-service-ca\") pod \"23b12de7-c42d-44a6-bf65-2934fd21ff40\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") "
Apr 22 18:39:39.971098 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.970861 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6rrw\" (UniqueName: \"kubernetes.io/projected/23b12de7-c42d-44a6-bf65-2934fd21ff40-kube-api-access-x6rrw\") pod \"23b12de7-c42d-44a6-bf65-2934fd21ff40\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") "
Apr 22 18:39:39.971098 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.970896 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-serving-cert\") pod \"23b12de7-c42d-44a6-bf65-2934fd21ff40\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") "
Apr 22 18:39:39.971098 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.970953 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-oauth-serving-cert\") pod \"23b12de7-c42d-44a6-bf65-2934fd21ff40\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") "
Apr 22 18:39:39.971098 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.970988 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-config\") pod \"23b12de7-c42d-44a6-bf65-2934fd21ff40\" (UID: \"23b12de7-c42d-44a6-bf65-2934fd21ff40\") "
Apr 22 18:39:39.971385 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.971353 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "23b12de7-c42d-44a6-bf65-2934fd21ff40" (UID: "23b12de7-c42d-44a6-bf65-2934fd21ff40"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:39:39.971458 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.971351 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-service-ca" (OuterVolumeSpecName: "service-ca") pod "23b12de7-c42d-44a6-bf65-2934fd21ff40" (UID: "23b12de7-c42d-44a6-bf65-2934fd21ff40"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:39:39.971458 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.971422 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-config" (OuterVolumeSpecName: "console-config") pod "23b12de7-c42d-44a6-bf65-2934fd21ff40" (UID: "23b12de7-c42d-44a6-bf65-2934fd21ff40"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:39:39.971458 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.971439 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "23b12de7-c42d-44a6-bf65-2934fd21ff40" (UID: "23b12de7-c42d-44a6-bf65-2934fd21ff40"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:39:39.973301 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.973272 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b12de7-c42d-44a6-bf65-2934fd21ff40-kube-api-access-x6rrw" (OuterVolumeSpecName: "kube-api-access-x6rrw") pod "23b12de7-c42d-44a6-bf65-2934fd21ff40" (UID: "23b12de7-c42d-44a6-bf65-2934fd21ff40"). InnerVolumeSpecName "kube-api-access-x6rrw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:39:39.973301 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.973286 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "23b12de7-c42d-44a6-bf65-2934fd21ff40" (UID: "23b12de7-c42d-44a6-bf65-2934fd21ff40"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:39:39.973470 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:39.973449 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "23b12de7-c42d-44a6-bf65-2934fd21ff40" (UID: "23b12de7-c42d-44a6-bf65-2934fd21ff40"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:39:40.072700 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.072647 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6rrw\" (UniqueName: \"kubernetes.io/projected/23b12de7-c42d-44a6-bf65-2934fd21ff40-kube-api-access-x6rrw\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:39:40.072700 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.072693 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:39:40.072700 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.072703 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-oauth-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:39:40.072700 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.072712 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:39:40.072700 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.072723 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b12de7-c42d-44a6-bf65-2934fd21ff40-console-oauth-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:39:40.073052 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.072734 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-trusted-ca-bundle\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:39:40.073052 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.072743 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b12de7-c42d-44a6-bf65-2934fd21ff40-service-ca\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:39:40.633383 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.633358 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5fbcf77b-x29r4_23b12de7-c42d-44a6-bf65-2934fd21ff40/console/0.log"
Apr 22 18:39:40.633566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.633395 2575 generic.go:358] "Generic (PLEG): container finished" podID="23b12de7-c42d-44a6-bf65-2934fd21ff40" containerID="70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e" exitCode=2
Apr 22 18:39:40.633566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.633429 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5fbcf77b-x29r4" event={"ID":"23b12de7-c42d-44a6-bf65-2934fd21ff40","Type":"ContainerDied","Data":"70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e"}
Apr 22 18:39:40.633566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.633452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5fbcf77b-x29r4" event={"ID":"23b12de7-c42d-44a6-bf65-2934fd21ff40","Type":"ContainerDied","Data":"17aca6c89e9b2b70a9fbe43e96a06357ff28e3b9f1f50a37dbe9e02beae951f1"}
Apr 22 18:39:40.633566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.633462 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5fbcf77b-x29r4"
Apr 22 18:39:40.633566 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.633466 2575 scope.go:117] "RemoveContainer" containerID="70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e"
Apr 22 18:39:40.642270 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.642239 2575 scope.go:117] "RemoveContainer" containerID="70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e"
Apr 22 18:39:40.642540 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:39:40.642512 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e\": container with ID starting with 70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e not found: ID does not exist" containerID="70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e"
Apr 22 18:39:40.642588 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.642548 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e"} err="failed to get container status \"70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e\": rpc error: code = NotFound desc = could not find container \"70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e\": container with ID starting with 70f4e3ea41ce3048bea9788bb31d42a2fd7c09db5492be12d9d10b0d6c5b512e not found: ID does not exist"
Apr 22 18:39:40.656976 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.656942 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c5fbcf77b-x29r4"]
Apr 22 18:39:40.661010 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.660983 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c5fbcf77b-x29r4"]
Apr 22 18:39:40.934160 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:40.934122 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b12de7-c42d-44a6-bf65-2934fd21ff40" path="/var/lib/kubelet/pods/23b12de7-c42d-44a6-bf65-2934fd21ff40/volumes"
Apr 22 18:39:50.666700 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:50.666654 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f8c6fbfb6-vr9rj" podUID="b29951f7-1c0b-47f1-9013-cc6997ecbc4b" containerName="console" containerID="cri-o://ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d" gracePeriod=15
Apr 22 18:39:50.915887 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:50.915862 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8c6fbfb6-vr9rj_b29951f7-1c0b-47f1-9013-cc6997ecbc4b/console/0.log"
Apr 22 18:39:50.916043 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:50.915946 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8c6fbfb6-vr9rj"
Apr 22 18:39:51.076153 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076057 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-config\") pod \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") "
Apr 22 18:39:51.076153 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076101 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-trusted-ca-bundle\") pod \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") "
Apr 22 18:39:51.076153 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076137 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-oauth-serving-cert\") pod \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") "
Apr 22 18:39:51.076436 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076280 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-oauth-config\") pod \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") "
Apr 22 18:39:51.076436 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076364 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-service-ca\") pod \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") "
Apr 22 18:39:51.076436 ip-10-0-129-142
kubenswrapper[2575]: I0422 18:39:51.076422 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-serving-cert\") pod \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " Apr 22 18:39:51.076581 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076451 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlbb8\" (UniqueName: \"kubernetes.io/projected/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-kube-api-access-qlbb8\") pod \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\" (UID: \"b29951f7-1c0b-47f1-9013-cc6997ecbc4b\") " Apr 22 18:39:51.076581 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076513 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b29951f7-1c0b-47f1-9013-cc6997ecbc4b" (UID: "b29951f7-1c0b-47f1-9013-cc6997ecbc4b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:51.076581 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076527 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-config" (OuterVolumeSpecName: "console-config") pod "b29951f7-1c0b-47f1-9013-cc6997ecbc4b" (UID: "b29951f7-1c0b-47f1-9013-cc6997ecbc4b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:51.076711 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076687 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b29951f7-1c0b-47f1-9013-cc6997ecbc4b" (UID: "b29951f7-1c0b-47f1-9013-cc6997ecbc4b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:51.076811 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076719 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:39:51.076811 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076738 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-oauth-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:39:51.076811 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.076758 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-service-ca" (OuterVolumeSpecName: "service-ca") pod "b29951f7-1c0b-47f1-9013-cc6997ecbc4b" (UID: "b29951f7-1c0b-47f1-9013-cc6997ecbc4b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:51.078637 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.078609 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b29951f7-1c0b-47f1-9013-cc6997ecbc4b" (UID: "b29951f7-1c0b-47f1-9013-cc6997ecbc4b"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:51.078759 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.078665 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-kube-api-access-qlbb8" (OuterVolumeSpecName: "kube-api-access-qlbb8") pod "b29951f7-1c0b-47f1-9013-cc6997ecbc4b" (UID: "b29951f7-1c0b-47f1-9013-cc6997ecbc4b"). InnerVolumeSpecName "kube-api-access-qlbb8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:51.078759 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.078669 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b29951f7-1c0b-47f1-9013-cc6997ecbc4b" (UID: "b29951f7-1c0b-47f1-9013-cc6997ecbc4b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:51.178000 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.177917 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:39:51.178000 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.177994 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qlbb8\" (UniqueName: \"kubernetes.io/projected/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-kube-api-access-qlbb8\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:39:51.178000 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.178006 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-trusted-ca-bundle\") on node \"ip-10-0-129-142.ec2.internal\" 
DevicePath \"\"" Apr 22 18:39:51.178000 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.178016 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-console-oauth-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:39:51.178273 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.178028 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29951f7-1c0b-47f1-9013-cc6997ecbc4b-service-ca\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:39:51.668963 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.668912 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f8c6fbfb6-vr9rj_b29951f7-1c0b-47f1-9013-cc6997ecbc4b/console/0.log" Apr 22 18:39:51.669353 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.668974 2575 generic.go:358] "Generic (PLEG): container finished" podID="b29951f7-1c0b-47f1-9013-cc6997ecbc4b" containerID="ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d" exitCode=2 Apr 22 18:39:51.669353 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.669017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8c6fbfb6-vr9rj" event={"ID":"b29951f7-1c0b-47f1-9013-cc6997ecbc4b","Type":"ContainerDied","Data":"ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d"} Apr 22 18:39:51.669353 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.669043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f8c6fbfb6-vr9rj" event={"ID":"b29951f7-1c0b-47f1-9013-cc6997ecbc4b","Type":"ContainerDied","Data":"cc9cbcfe9b8a016872e49c01bc04cb154dd783f7450b98bb91714f3fdb6d559e"} Apr 22 18:39:51.669353 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.669057 2575 scope.go:117] "RemoveContainer" 
containerID="ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d" Apr 22 18:39:51.669353 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.669074 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f8c6fbfb6-vr9rj" Apr 22 18:39:51.677280 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.677259 2575 scope.go:117] "RemoveContainer" containerID="ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d" Apr 22 18:39:51.677568 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:39:51.677549 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d\": container with ID starting with ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d not found: ID does not exist" containerID="ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d" Apr 22 18:39:51.677613 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.677578 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d"} err="failed to get container status \"ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d\": rpc error: code = NotFound desc = could not find container \"ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d\": container with ID starting with ec9a1c428fc0db567d8a537b71a3673cd676975ee99ac12e8d76184bf822d61d not found: ID does not exist" Apr 22 18:39:51.692473 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.692434 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f8c6fbfb6-vr9rj"] Apr 22 18:39:51.696865 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:51.696699 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f8c6fbfb6-vr9rj"] Apr 22 18:39:52.933094 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:39:52.933061 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29951f7-1c0b-47f1-9013-cc6997ecbc4b" path="/var/lib/kubelet/pods/b29951f7-1c0b-47f1-9013-cc6997ecbc4b/volumes" Apr 22 18:39:57.337774 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.337741 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9z6ln"] Apr 22 18:39:57.338179 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.338093 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23b12de7-c42d-44a6-bf65-2934fd21ff40" containerName="console" Apr 22 18:39:57.338179 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.338106 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b12de7-c42d-44a6-bf65-2934fd21ff40" containerName="console" Apr 22 18:39:57.338179 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.338121 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b29951f7-1c0b-47f1-9013-cc6997ecbc4b" containerName="console" Apr 22 18:39:57.338179 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.338127 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29951f7-1c0b-47f1-9013-cc6997ecbc4b" containerName="console" Apr 22 18:39:57.338304 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.338185 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b29951f7-1c0b-47f1-9013-cc6997ecbc4b" containerName="console" Apr 22 18:39:57.338304 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.338193 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="23b12de7-c42d-44a6-bf65-2934fd21ff40" containerName="console" Apr 22 18:39:57.342459 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.342440 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.344911 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.344892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:39:57.348837 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.348815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9z6ln"] Apr 22 18:39:57.430886 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.430838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-kubelet-config\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.431116 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.430953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-original-pull-secret\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.431116 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.431018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-dbus\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.531879 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.531837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-original-pull-secret\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.532051 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.531913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-dbus\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.532051 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.531983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-kubelet-config\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.532127 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.532089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-kubelet-config\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.532127 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.532110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-dbus\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.534311 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.534295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f02372ee-17cf-4f7b-9ae1-04ef56a6c468-original-pull-secret\") pod \"global-pull-secret-syncer-9z6ln\" (UID: \"f02372ee-17cf-4f7b-9ae1-04ef56a6c468\") " pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.652314 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.652286 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9z6ln" Apr 22 18:39:57.776042 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:57.776012 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9z6ln"] Apr 22 18:39:57.778567 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:39:57.778540 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf02372ee_17cf_4f7b_9ae1_04ef56a6c468.slice/crio-3492e76c43e3fd49f3cc4098bdcf25783b53d88540d21e0e2b93baa45c8c6c29 WatchSource:0}: Error finding container 3492e76c43e3fd49f3cc4098bdcf25783b53d88540d21e0e2b93baa45c8c6c29: Status 404 returned error can't find the container with id 3492e76c43e3fd49f3cc4098bdcf25783b53d88540d21e0e2b93baa45c8c6c29 Apr 22 18:39:58.690234 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:39:58.690186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9z6ln" event={"ID":"f02372ee-17cf-4f7b-9ae1-04ef56a6c468","Type":"ContainerStarted","Data":"3492e76c43e3fd49f3cc4098bdcf25783b53d88540d21e0e2b93baa45c8c6c29"} Apr 22 18:40:03.707314 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:40:03.707277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9z6ln" event={"ID":"f02372ee-17cf-4f7b-9ae1-04ef56a6c468","Type":"ContainerStarted","Data":"828b9dce22e4bce1d515d4fc4c5ce8519ec9c626c2f8a55bdd5b26dd881d1c1a"} Apr 22 18:40:03.727381 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:40:03.727330 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9z6ln" podStartSLOduration=1.725406137 podStartE2EDuration="6.727314785s" podCreationTimestamp="2026-04-22 18:39:57 +0000 UTC" firstStartedPulling="2026-04-22 18:39:57.780323701 +0000 UTC m=+193.487546113" lastFinishedPulling="2026-04-22 18:40:02.782232353 +0000 UTC m=+198.489454761" observedRunningTime="2026-04-22 18:40:03.726052632 +0000 UTC m=+199.433275061" watchObservedRunningTime="2026-04-22 18:40:03.727314785 +0000 UTC m=+199.434537212" Apr 22 18:41:16.310966 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.310871 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2"] Apr 22 18:41:16.314575 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.314550 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.317358 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.317332 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:41:16.317472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.317384 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:41:16.318545 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.318529 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kqs99\"" Apr 22 18:41:16.323596 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.323567 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2"] Apr 22 18:41:16.379359 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.379323 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.379359 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.379363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmwjx\" (UniqueName: \"kubernetes.io/projected/a86be88d-c3a7-4811-8d69-0d1f24addf45-kube-api-access-dmwjx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.379580 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.379442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.480716 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.480679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.480879 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.480750 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.480879 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.480776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmwjx\" (UniqueName: \"kubernetes.io/projected/a86be88d-c3a7-4811-8d69-0d1f24addf45-kube-api-access-dmwjx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.481129 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.481108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.481167 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.481150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.491189 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.491163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmwjx\" 
(UniqueName: \"kubernetes.io/projected/a86be88d-c3a7-4811-8d69-0d1f24addf45-kube-api-access-dmwjx\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.624870 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.624781 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:16.756106 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.755898 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2"] Apr 22 18:41:16.758714 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:41:16.758682 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86be88d_c3a7_4811_8d69_0d1f24addf45.slice/crio-a3802d682e2eb9d15dd8c9040584af6e8634e598d29c8bb15f0a2b8d6cc7514c WatchSource:0}: Error finding container a3802d682e2eb9d15dd8c9040584af6e8634e598d29c8bb15f0a2b8d6cc7514c: Status 404 returned error can't find the container with id a3802d682e2eb9d15dd8c9040584af6e8634e598d29c8bb15f0a2b8d6cc7514c Apr 22 18:41:16.914296 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:16.914265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" event={"ID":"a86be88d-c3a7-4811-8d69-0d1f24addf45","Type":"ContainerStarted","Data":"a3802d682e2eb9d15dd8c9040584af6e8634e598d29c8bb15f0a2b8d6cc7514c"} Apr 22 18:41:21.930825 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:21.930787 2575 generic.go:358] "Generic (PLEG): container finished" podID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerID="5001750a419b139736c5f30051a4f25f78a3197f92ae3bb5632c74a418024476" 
exitCode=0 Apr 22 18:41:21.931229 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:21.930837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" event={"ID":"a86be88d-c3a7-4811-8d69-0d1f24addf45","Type":"ContainerDied","Data":"5001750a419b139736c5f30051a4f25f78a3197f92ae3bb5632c74a418024476"} Apr 22 18:41:24.940497 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:24.940466 2575 generic.go:358] "Generic (PLEG): container finished" podID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerID="5f2a4c90edf9f6192bb18dbf78662b0f631ecbbbd135d1454597b648ab7cea2f" exitCode=0 Apr 22 18:41:24.940903 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:24.940554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" event={"ID":"a86be88d-c3a7-4811-8d69-0d1f24addf45","Type":"ContainerDied","Data":"5f2a4c90edf9f6192bb18dbf78662b0f631ecbbbd135d1454597b648ab7cea2f"} Apr 22 18:41:33.974007 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:33.973972 2575 generic.go:358] "Generic (PLEG): container finished" podID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerID="5af6a05eb5dca5d81235b38e4e7f4c8400ca897514be4bed1bd32c3ec8a18683" exitCode=0 Apr 22 18:41:33.974395 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:33.974062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" event={"ID":"a86be88d-c3a7-4811-8d69-0d1f24addf45","Type":"ContainerDied","Data":"5af6a05eb5dca5d81235b38e4e7f4c8400ca897514be4bed1bd32c3ec8a18683"} Apr 22 18:41:35.102803 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.102777 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:35.254206 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.254108 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmwjx\" (UniqueName: \"kubernetes.io/projected/a86be88d-c3a7-4811-8d69-0d1f24addf45-kube-api-access-dmwjx\") pod \"a86be88d-c3a7-4811-8d69-0d1f24addf45\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " Apr 22 18:41:35.254206 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.254188 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-bundle\") pod \"a86be88d-c3a7-4811-8d69-0d1f24addf45\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " Apr 22 18:41:35.254417 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.254294 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-util\") pod \"a86be88d-c3a7-4811-8d69-0d1f24addf45\" (UID: \"a86be88d-c3a7-4811-8d69-0d1f24addf45\") " Apr 22 18:41:35.254817 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.254793 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-bundle" (OuterVolumeSpecName: "bundle") pod "a86be88d-c3a7-4811-8d69-0d1f24addf45" (UID: "a86be88d-c3a7-4811-8d69-0d1f24addf45"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:41:35.256621 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.256596 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86be88d-c3a7-4811-8d69-0d1f24addf45-kube-api-access-dmwjx" (OuterVolumeSpecName: "kube-api-access-dmwjx") pod "a86be88d-c3a7-4811-8d69-0d1f24addf45" (UID: "a86be88d-c3a7-4811-8d69-0d1f24addf45"). InnerVolumeSpecName "kube-api-access-dmwjx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:41:35.259537 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.259502 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-util" (OuterVolumeSpecName: "util") pod "a86be88d-c3a7-4811-8d69-0d1f24addf45" (UID: "a86be88d-c3a7-4811-8d69-0d1f24addf45"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:41:35.355637 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.355600 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-util\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:41:35.355637 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.355631 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dmwjx\" (UniqueName: \"kubernetes.io/projected/a86be88d-c3a7-4811-8d69-0d1f24addf45-kube-api-access-dmwjx\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:41:35.355637 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.355642 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86be88d-c3a7-4811-8d69-0d1f24addf45-bundle\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:41:35.981671 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.981635 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" event={"ID":"a86be88d-c3a7-4811-8d69-0d1f24addf45","Type":"ContainerDied","Data":"a3802d682e2eb9d15dd8c9040584af6e8634e598d29c8bb15f0a2b8d6cc7514c"} Apr 22 18:41:35.981671 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.981669 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3802d682e2eb9d15dd8c9040584af6e8634e598d29c8bb15f0a2b8d6cc7514c" Apr 22 18:41:35.981883 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:35.981683 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccpvr2" Apr 22 18:41:37.896601 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.896559 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx"] Apr 22 18:41:37.897199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.897056 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerName="util" Apr 22 18:41:37.897199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.897083 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerName="util" Apr 22 18:41:37.897199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.897100 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerName="extract" Apr 22 18:41:37.897199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.897109 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerName="extract" Apr 22 18:41:37.897199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.897135 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerName="pull" Apr 22 18:41:37.897199 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.897143 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerName="pull" Apr 22 18:41:37.897492 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.897232 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a86be88d-c3a7-4811-8d69-0d1f24addf45" containerName="extract" Apr 22 18:41:37.899778 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.899754 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:37.902714 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.902690 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:41:37.902846 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.902744 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-7d95b\"" Apr 22 18:41:37.902846 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.902754 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:41:37.902846 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.902777 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:41:37.915489 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.911862 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx"] Apr 22 18:41:37.981313 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.981255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjtgk\" (UniqueName: 
\"kubernetes.io/projected/cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0-kube-api-access-tjtgk\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx\" (UID: \"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:37.981494 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:37.981419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx\" (UID: \"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:38.082721 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:38.082662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx\" (UID: \"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:38.082721 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:38.082724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjtgk\" (UniqueName: \"kubernetes.io/projected/cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0-kube-api-access-tjtgk\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx\" (UID: \"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:38.085309 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:38.085279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx\" (UID: 
\"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:38.093312 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:38.093282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjtgk\" (UniqueName: \"kubernetes.io/projected/cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0-kube-api-access-tjtgk\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx\" (UID: \"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:38.211075 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:38.211023 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:38.347071 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:38.347039 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx"] Apr 22 18:41:38.350173 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:41:38.350142 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff2c2b8_dd4a_4cf1_9a9d_55a24a5c54c0.slice/crio-dd14a10304abad38a352f127a7b78959f104319c26709036cb484f6f4ab0f3c0 WatchSource:0}: Error finding container dd14a10304abad38a352f127a7b78959f104319c26709036cb484f6f4ab0f3c0: Status 404 returned error can't find the container with id dd14a10304abad38a352f127a7b78959f104319c26709036cb484f6f4ab0f3c0 Apr 22 18:41:38.992359 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:38.992316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" event={"ID":"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0","Type":"ContainerStarted","Data":"dd14a10304abad38a352f127a7b78959f104319c26709036cb484f6f4ab0f3c0"} Apr 22 18:41:43.611239 ip-10-0-129-142 kubenswrapper[2575]: 
I0422 18:41:43.611202 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vxhm9"] Apr 22 18:41:43.613405 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.613384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.616128 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.616107 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:41:43.616233 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.616139 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 18:41:43.616233 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.616161 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-sprz7\"" Apr 22 18:41:43.625186 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.625166 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vxhm9"] Apr 22 18:41:43.735402 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.735358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c042294c-b34a-4039-948d-f63fd64c7676-cabundle0\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.735575 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.735417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7sbg\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-kube-api-access-n7sbg\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " 
pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.735575 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.735460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.836494 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.836460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c042294c-b34a-4039-948d-f63fd64c7676-cabundle0\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.836682 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.836510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7sbg\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-kube-api-access-n7sbg\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.836682 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.836551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.836760 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:43.836691 2575 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:41:43.836760 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:43.836707 2575 projected.go:277] Couldn't get secret 
payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:41:43.836760 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:43.836719 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vxhm9: references non-existent secret key: ca.crt Apr 22 18:41:43.836875 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:43.836790 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates podName:c042294c-b34a-4039-948d-f63fd64c7676 nodeName:}" failed. No retries permitted until 2026-04-22 18:41:44.336767367 +0000 UTC m=+300.043989773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates") pod "keda-operator-ffbb595cb-vxhm9" (UID: "c042294c-b34a-4039-948d-f63fd64c7676") : references non-existent secret key: ca.crt Apr 22 18:41:43.837198 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.837180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c042294c-b34a-4039-948d-f63fd64c7676-cabundle0\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.846053 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.846025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7sbg\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-kube-api-access-n7sbg\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:43.872663 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.872591 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt"] Apr 22 18:41:43.874989 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.874970 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:43.877472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.877450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 18:41:43.884554 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:43.884526 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt"] Apr 22 18:41:44.009727 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.009677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" event={"ID":"cff2c2b8-dd4a-4cf1-9a9d-55a24a5c54c0","Type":"ContainerStarted","Data":"b603fd0fa730cc327f6503e3e4da0f7cfcf4bb7ddb2b3ec85607ffde1d64d914"} Apr 22 18:41:44.009915 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.009803 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:41:44.031320 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.031254 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" podStartSLOduration=2.330649416 podStartE2EDuration="7.031238611s" podCreationTimestamp="2026-04-22 18:41:37 +0000 UTC" firstStartedPulling="2026-04-22 18:41:38.352400047 +0000 UTC m=+294.059622453" lastFinishedPulling="2026-04-22 18:41:43.052989243 +0000 UTC m=+298.760211648" observedRunningTime="2026-04-22 18:41:44.029349192 +0000 UTC m=+299.736571621" watchObservedRunningTime="2026-04-22 18:41:44.031238611 +0000 UTC m=+299.738461104" Apr 22 18:41:44.038264 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:41:44.038233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/af2716ab-a85a-432c-803b-8f6cebbefccc-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.038414 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.038388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.038460 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.038433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn8j2\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-kube-api-access-pn8j2\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.100523 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.100492 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-lp7lc"] Apr 22 18:41:44.102895 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.102872 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.105358 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.105322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 18:41:44.112557 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.112528 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lp7lc"] Apr 22 18:41:44.139718 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.139629 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/af2716ab-a85a-432c-803b-8f6cebbefccc-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.139880 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.139733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.139880 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.139762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn8j2\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-kube-api-access-pn8j2\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.139880 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.139849 2575 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:41:44.139880 ip-10-0-129-142 kubenswrapper[2575]: E0422 
18:41:44.139871 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:41:44.140138 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.139893 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt: references non-existent secret key: tls.crt Apr 22 18:41:44.140138 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.139995 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates podName:af2716ab-a85a-432c-803b-8f6cebbefccc nodeName:}" failed. No retries permitted until 2026-04-22 18:41:44.639973877 +0000 UTC m=+300.347196288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates") pod "keda-metrics-apiserver-7c9f485588-c9fpt" (UID: "af2716ab-a85a-432c-803b-8f6cebbefccc") : references non-existent secret key: tls.crt Apr 22 18:41:44.140138 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.140098 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/af2716ab-a85a-432c-803b-8f6cebbefccc-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.151866 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.151841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn8j2\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-kube-api-access-pn8j2\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.240333 ip-10-0-129-142 
kubenswrapper[2575]: I0422 18:41:44.240295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8a241a04-b046-463a-9412-68d0d4f4b12f-certificates\") pod \"keda-admission-cf49989db-lp7lc\" (UID: \"8a241a04-b046-463a-9412-68d0d4f4b12f\") " pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.240520 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.240339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fq4r\" (UniqueName: \"kubernetes.io/projected/8a241a04-b046-463a-9412-68d0d4f4b12f-kube-api-access-7fq4r\") pod \"keda-admission-cf49989db-lp7lc\" (UID: \"8a241a04-b046-463a-9412-68d0d4f4b12f\") " pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.343221 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.341820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:44.343221 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.341914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8a241a04-b046-463a-9412-68d0d4f4b12f-certificates\") pod \"keda-admission-cf49989db-lp7lc\" (UID: \"8a241a04-b046-463a-9412-68d0d4f4b12f\") " pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.343221 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.341958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq4r\" (UniqueName: \"kubernetes.io/projected/8a241a04-b046-463a-9412-68d0d4f4b12f-kube-api-access-7fq4r\") pod \"keda-admission-cf49989db-lp7lc\" 
(UID: \"8a241a04-b046-463a-9412-68d0d4f4b12f\") " pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.343221 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.342413 2575 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:41:44.343221 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.342432 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:41:44.343221 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.342443 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vxhm9: references non-existent secret key: ca.crt Apr 22 18:41:44.343221 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.342503 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates podName:c042294c-b34a-4039-948d-f63fd64c7676 nodeName:}" failed. No retries permitted until 2026-04-22 18:41:45.342484006 +0000 UTC m=+301.049706419 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates") pod "keda-operator-ffbb595cb-vxhm9" (UID: "c042294c-b34a-4039-948d-f63fd64c7676") : references non-existent secret key: ca.crt Apr 22 18:41:44.346214 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.346184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8a241a04-b046-463a-9412-68d0d4f4b12f-certificates\") pod \"keda-admission-cf49989db-lp7lc\" (UID: \"8a241a04-b046-463a-9412-68d0d4f4b12f\") " pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.353256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.352103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fq4r\" (UniqueName: \"kubernetes.io/projected/8a241a04-b046-463a-9412-68d0d4f4b12f-kube-api-access-7fq4r\") pod \"keda-admission-cf49989db-lp7lc\" (UID: \"8a241a04-b046-463a-9412-68d0d4f4b12f\") " pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.414977 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.414919 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:44.564166 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.564135 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lp7lc"] Apr 22 18:41:44.567278 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:41:44.567231 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a241a04_b046_463a_9412_68d0d4f4b12f.slice/crio-cd78962118bbdd4c25b3d4b9120ce4700bc76833b93787edd10d2cea1908c45c WatchSource:0}: Error finding container cd78962118bbdd4c25b3d4b9120ce4700bc76833b93787edd10d2cea1908c45c: Status 404 returned error can't find the container with id cd78962118bbdd4c25b3d4b9120ce4700bc76833b93787edd10d2cea1908c45c Apr 22 18:41:44.645324 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.645286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:44.645727 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.645391 2575 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:41:44.645727 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.645409 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:41:44.645727 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.645426 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt: references non-existent secret key: tls.crt Apr 22 18:41:44.645727 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:44.645484 2575 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates podName:af2716ab-a85a-432c-803b-8f6cebbefccc nodeName:}" failed. No retries permitted until 2026-04-22 18:41:45.645466405 +0000 UTC m=+301.352688811 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates") pod "keda-metrics-apiserver-7c9f485588-c9fpt" (UID: "af2716ab-a85a-432c-803b-8f6cebbefccc") : references non-existent secret key: tls.crt Apr 22 18:41:44.796162 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:44.796098 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:41:45.014252 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:45.014194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lp7lc" event={"ID":"8a241a04-b046-463a-9412-68d0d4f4b12f","Type":"ContainerStarted","Data":"cd78962118bbdd4c25b3d4b9120ce4700bc76833b93787edd10d2cea1908c45c"} Apr 22 18:41:45.352869 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:45.352835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:45.353090 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.353033 2575 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:41:45.353090 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.353054 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:41:45.353090 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.353064 2575 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-operator-ffbb595cb-vxhm9: references non-existent secret key: ca.crt Apr 22 18:41:45.353246 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.353121 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates podName:c042294c-b34a-4039-948d-f63fd64c7676 nodeName:}" failed. No retries permitted until 2026-04-22 18:41:47.353100841 +0000 UTC m=+303.060323247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates") pod "keda-operator-ffbb595cb-vxhm9" (UID: "c042294c-b34a-4039-948d-f63fd64c7676") : references non-existent secret key: ca.crt Apr 22 18:41:45.656052 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:45.656027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:45.656334 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.656199 2575 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:41:45.656334 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.656222 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:41:45.656334 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.656243 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt: references non-existent secret key: tls.crt Apr 22 18:41:45.656334 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:41:45.656308 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates podName:af2716ab-a85a-432c-803b-8f6cebbefccc nodeName:}" failed. No retries permitted until 2026-04-22 18:41:47.656288183 +0000 UTC m=+303.363510611 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates") pod "keda-metrics-apiserver-7c9f485588-c9fpt" (UID: "af2716ab-a85a-432c-803b-8f6cebbefccc") : references non-existent secret key: tls.crt Apr 22 18:41:46.023845 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:46.023802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lp7lc" event={"ID":"8a241a04-b046-463a-9412-68d0d4f4b12f","Type":"ContainerStarted","Data":"a10a00bdceb20980aad843adb5acdfe194ce6deda762a7a5ba9e0f4809c6a741"} Apr 22 18:41:46.024071 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:46.023995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:41:46.044722 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:46.044663 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-lp7lc" podStartSLOduration=1.020266211 podStartE2EDuration="2.044648759s" podCreationTimestamp="2026-04-22 18:41:44 +0000 UTC" firstStartedPulling="2026-04-22 18:41:44.568536733 +0000 UTC m=+300.275759139" lastFinishedPulling="2026-04-22 18:41:45.592919272 +0000 UTC m=+301.300141687" observedRunningTime="2026-04-22 18:41:46.043400907 +0000 UTC m=+301.750623335" watchObservedRunningTime="2026-04-22 18:41:46.044648759 +0000 UTC m=+301.751871187" Apr 22 18:41:47.373460 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.373416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:47.376063 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.376041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c042294c-b34a-4039-948d-f63fd64c7676-certificates\") pod \"keda-operator-ffbb595cb-vxhm9\" (UID: \"c042294c-b34a-4039-948d-f63fd64c7676\") " pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:47.524014 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.523975 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:47.650588 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.650566 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vxhm9"] Apr 22 18:41:47.652896 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:41:47.652860 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc042294c_b34a_4039_948d_f63fd64c7676.slice/crio-2303bd291b068fde4cd43d94f7a2442beb205fb757ccb46c23668ff49ba6b88a WatchSource:0}: Error finding container 2303bd291b068fde4cd43d94f7a2442beb205fb757ccb46c23668ff49ba6b88a: Status 404 returned error can't find the container with id 2303bd291b068fde4cd43d94f7a2442beb205fb757ccb46c23668ff49ba6b88a Apr 22 18:41:47.654215 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.654198 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:41:47.677816 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.677778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:47.680403 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.680384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/af2716ab-a85a-432c-803b-8f6cebbefccc-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c9fpt\" (UID: \"af2716ab-a85a-432c-803b-8f6cebbefccc\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:47.786244 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.786187 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:47.911899 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:47.911867 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt"] Apr 22 18:41:47.914224 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:41:47.914195 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2716ab_a85a_432c_803b_8f6cebbefccc.slice/crio-f72dd6276e14f40b46c357c3acdfa4ee24d0ca1f6cc40a3c17e962263c182643 WatchSource:0}: Error finding container f72dd6276e14f40b46c357c3acdfa4ee24d0ca1f6cc40a3c17e962263c182643: Status 404 returned error can't find the container with id f72dd6276e14f40b46c357c3acdfa4ee24d0ca1f6cc40a3c17e962263c182643 Apr 22 18:41:48.031853 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:48.031813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" event={"ID":"c042294c-b34a-4039-948d-f63fd64c7676","Type":"ContainerStarted","Data":"2303bd291b068fde4cd43d94f7a2442beb205fb757ccb46c23668ff49ba6b88a"} Apr 22 18:41:48.032782 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:48.032758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" event={"ID":"af2716ab-a85a-432c-803b-8f6cebbefccc","Type":"ContainerStarted","Data":"f72dd6276e14f40b46c357c3acdfa4ee24d0ca1f6cc40a3c17e962263c182643"} Apr 22 18:41:52.048165 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:52.048120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" event={"ID":"af2716ab-a85a-432c-803b-8f6cebbefccc","Type":"ContainerStarted","Data":"d9b0c84f65028b38abe93d9d787bc5e80c715323be11c63723f782a7e6b0fd02"} Apr 22 18:41:52.048657 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:52.048309 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:41:52.049472 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:52.049448 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" event={"ID":"c042294c-b34a-4039-948d-f63fd64c7676","Type":"ContainerStarted","Data":"892596e9bba2378469d3e56f8d81ac12cef32d42c3d0e711212a0f1c3de98f04"} Apr 22 18:41:52.049600 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:52.049588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:41:52.068360 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:52.068310 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" podStartSLOduration=5.533254378 podStartE2EDuration="9.068294646s" podCreationTimestamp="2026-04-22 18:41:43 +0000 UTC" firstStartedPulling="2026-04-22 18:41:47.915645963 +0000 UTC m=+303.622868368" lastFinishedPulling="2026-04-22 18:41:51.450686212 +0000 UTC m=+307.157908636" observedRunningTime="2026-04-22 18:41:52.067791937 +0000 
UTC m=+307.775014406" watchObservedRunningTime="2026-04-22 18:41:52.068294646 +0000 UTC m=+307.775517073" Apr 22 18:41:52.091818 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:41:52.091760 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" podStartSLOduration=5.2937525690000005 podStartE2EDuration="9.091746052s" podCreationTimestamp="2026-04-22 18:41:43 +0000 UTC" firstStartedPulling="2026-04-22 18:41:47.654322089 +0000 UTC m=+303.361544495" lastFinishedPulling="2026-04-22 18:41:51.452315572 +0000 UTC m=+307.159537978" observedRunningTime="2026-04-22 18:41:52.090537254 +0000 UTC m=+307.797759705" watchObservedRunningTime="2026-04-22 18:41:52.091746052 +0000 UTC m=+307.798968480" Apr 22 18:42:03.058298 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:03.058265 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c9fpt" Apr 22 18:42:05.016687 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:05.016653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-v7gvx" Apr 22 18:42:07.029335 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:07.029301 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-lp7lc" Apr 22 18:42:13.055894 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:13.055863 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-vxhm9" Apr 22 18:42:52.319959 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.319908 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-xn97h"] Apr 22 18:42:52.323119 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.323080 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8"] Apr 22 18:42:52.323256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.323140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:52.327025 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.326998 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:42:52.327377 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.327117 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:42:52.329241 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.328258 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 18:42:52.329241 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.328885 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:52.329241 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.329009 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-hf9bc\"" Apr 22 18:42:52.332023 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.332000 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:42:52.332189 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.332159 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-2p42w\"" Apr 22 18:42:52.336159 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.336137 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-xn97h"] Apr 22 18:42:52.338537 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.338514 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8"] Apr 22 18:42:52.431861 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.431823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gn6l8\" (UID: \"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:52.431861 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.431869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert\") pod \"kserve-controller-manager-d9c56dd68-xn97h\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:52.432119 
ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.431900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tht4w\" (UniqueName: \"kubernetes.io/projected/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-kube-api-access-tht4w\") pod \"llmisvc-controller-manager-68cc5db7c4-gn6l8\" (UID: \"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:52.432119 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.432025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk4tf\" (UniqueName: \"kubernetes.io/projected/9cdb24d7-91fc-426a-9656-6d7eae83d596-kube-api-access-bk4tf\") pod \"kserve-controller-manager-d9c56dd68-xn97h\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:52.532963 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.532909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gn6l8\" (UID: \"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:52.533152 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.532974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert\") pod \"kserve-controller-manager-d9c56dd68-xn97h\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:52.533152 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.533020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tht4w\" (UniqueName: 
\"kubernetes.io/projected/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-kube-api-access-tht4w\") pod \"llmisvc-controller-manager-68cc5db7c4-gn6l8\" (UID: \"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:52.533152 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.533070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bk4tf\" (UniqueName: \"kubernetes.io/projected/9cdb24d7-91fc-426a-9656-6d7eae83d596-kube-api-access-bk4tf\") pod \"kserve-controller-manager-d9c56dd68-xn97h\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:52.533152 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:42:52.533068 2575 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 22 18:42:52.533152 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:42:52.533124 2575 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 18:42:52.533350 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:42:52.533189 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert podName:9cdb24d7-91fc-426a-9656-6d7eae83d596 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:53.033171288 +0000 UTC m=+368.740393695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert") pod "kserve-controller-manager-d9c56dd68-xn97h" (UID: "9cdb24d7-91fc-426a-9656-6d7eae83d596") : secret "kserve-webhook-server-cert" not found Apr 22 18:42:52.533350 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:42:52.533205 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-cert podName:6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:53.033197672 +0000 UTC m=+368.740420077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-cert") pod "llmisvc-controller-manager-68cc5db7c4-gn6l8" (UID: "6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69") : secret "llmisvc-webhook-server-cert" not found Apr 22 18:42:52.546961 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.546906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk4tf\" (UniqueName: \"kubernetes.io/projected/9cdb24d7-91fc-426a-9656-6d7eae83d596-kube-api-access-bk4tf\") pod \"kserve-controller-manager-d9c56dd68-xn97h\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:52.547077 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:52.546943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tht4w\" (UniqueName: \"kubernetes.io/projected/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-kube-api-access-tht4w\") pod \"llmisvc-controller-manager-68cc5db7c4-gn6l8\" (UID: \"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:53.037278 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.037236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gn6l8\" (UID: \"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:53.037474 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.037298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert\") pod \"kserve-controller-manager-d9c56dd68-xn97h\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:53.039946 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.039903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert\") pod \"kserve-controller-manager-d9c56dd68-xn97h\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:53.040060 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.039994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gn6l8\" (UID: \"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:53.240360 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.240322 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:53.247212 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.247186 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:53.379128 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.379094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-xn97h"] Apr 22 18:42:53.381123 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:42:53.381085 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cdb24d7_91fc_426a_9656_6d7eae83d596.slice/crio-2172e853da54dc81c1bc23e7113158c18f9cd155b4c491d50047007657e1cb48 WatchSource:0}: Error finding container 2172e853da54dc81c1bc23e7113158c18f9cd155b4c491d50047007657e1cb48: Status 404 returned error can't find the container with id 2172e853da54dc81c1bc23e7113158c18f9cd155b4c491d50047007657e1cb48 Apr 22 18:42:53.402512 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:53.402490 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8"] Apr 22 18:42:53.404957 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:42:53.404913 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b959351_ba4a_4bfa_8aeb_9ba4e0e2ae69.slice/crio-c64a776d6b192b341afcc60f6bafdefe5b7373f76b71afd42ac75eac1a9befd0 WatchSource:0}: Error finding container c64a776d6b192b341afcc60f6bafdefe5b7373f76b71afd42ac75eac1a9befd0: Status 404 returned error can't find the container with id c64a776d6b192b341afcc60f6bafdefe5b7373f76b71afd42ac75eac1a9befd0 Apr 22 18:42:54.264782 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:54.264737 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" event={"ID":"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69","Type":"ContainerStarted","Data":"c64a776d6b192b341afcc60f6bafdefe5b7373f76b71afd42ac75eac1a9befd0"} Apr 22 18:42:54.266876 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:54.266815 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" event={"ID":"9cdb24d7-91fc-426a-9656-6d7eae83d596","Type":"ContainerStarted","Data":"2172e853da54dc81c1bc23e7113158c18f9cd155b4c491d50047007657e1cb48"} Apr 22 18:42:57.279249 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:57.279211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" event={"ID":"9cdb24d7-91fc-426a-9656-6d7eae83d596","Type":"ContainerStarted","Data":"655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500"} Apr 22 18:42:57.279660 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:57.279316 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:42:57.280492 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:57.280467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" event={"ID":"6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69","Type":"ContainerStarted","Data":"1f8cff6e12f410e17fe34afc4700906dbf63bb663a4afbda72d638495fe903cc"} Apr 22 18:42:57.280631 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:57.280616 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:42:57.298250 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:57.298190 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" podStartSLOduration=2.002146161 podStartE2EDuration="5.298173953s" podCreationTimestamp="2026-04-22 18:42:52 +0000 UTC" firstStartedPulling="2026-04-22 18:42:53.382657954 +0000 UTC m=+369.089880361" lastFinishedPulling="2026-04-22 18:42:56.678685734 +0000 UTC m=+372.385908153" observedRunningTime="2026-04-22 18:42:57.296735074 +0000 UTC m=+373.003957503" watchObservedRunningTime="2026-04-22 
18:42:57.298173953 +0000 UTC m=+373.005396380" Apr 22 18:42:57.319899 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:42:57.319851 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" podStartSLOduration=2.046989003 podStartE2EDuration="5.319835641s" podCreationTimestamp="2026-04-22 18:42:52 +0000 UTC" firstStartedPulling="2026-04-22 18:42:53.406242658 +0000 UTC m=+369.113465064" lastFinishedPulling="2026-04-22 18:42:56.679089292 +0000 UTC m=+372.386311702" observedRunningTime="2026-04-22 18:42:57.318694928 +0000 UTC m=+373.025917356" watchObservedRunningTime="2026-04-22 18:42:57.319835641 +0000 UTC m=+373.027058095" Apr 22 18:43:28.286587 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:28.286549 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gn6l8" Apr 22 18:43:28.289633 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:28.289610 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:43:29.747112 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:29.747073 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-xn97h"] Apr 22 18:43:29.747537 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:29.747331 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" podUID="9cdb24d7-91fc-426a-9656-6d7eae83d596" containerName="manager" containerID="cri-o://655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500" gracePeriod=10 Apr 22 18:43:29.804942 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:29.804888 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-wf2bl"] Apr 22 18:43:29.834589 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:29.834555 2575 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-wf2bl"] Apr 22 18:43:29.834714 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:29.834610 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:29.965495 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:29.965447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa05c060-3d04-4ea5-a0c7-1529891918be-cert\") pod \"kserve-controller-manager-d9c56dd68-wf2bl\" (UID: \"fa05c060-3d04-4ea5-a0c7-1529891918be\") " pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:29.965495 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:29.965495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpmp\" (UniqueName: \"kubernetes.io/projected/fa05c060-3d04-4ea5-a0c7-1529891918be-kube-api-access-tzpmp\") pod \"kserve-controller-manager-d9c56dd68-wf2bl\" (UID: \"fa05c060-3d04-4ea5-a0c7-1529891918be\") " pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:30.004832 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.004777 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:43:30.066323 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.066288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa05c060-3d04-4ea5-a0c7-1529891918be-cert\") pod \"kserve-controller-manager-d9c56dd68-wf2bl\" (UID: \"fa05c060-3d04-4ea5-a0c7-1529891918be\") " pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:30.066523 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.066350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzpmp\" (UniqueName: \"kubernetes.io/projected/fa05c060-3d04-4ea5-a0c7-1529891918be-kube-api-access-tzpmp\") pod \"kserve-controller-manager-d9c56dd68-wf2bl\" (UID: \"fa05c060-3d04-4ea5-a0c7-1529891918be\") " pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:30.068825 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.068797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa05c060-3d04-4ea5-a0c7-1529891918be-cert\") pod \"kserve-controller-manager-d9c56dd68-wf2bl\" (UID: \"fa05c060-3d04-4ea5-a0c7-1529891918be\") " pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:30.080265 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.080229 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzpmp\" (UniqueName: \"kubernetes.io/projected/fa05c060-3d04-4ea5-a0c7-1529891918be-kube-api-access-tzpmp\") pod \"kserve-controller-manager-d9c56dd68-wf2bl\" (UID: \"fa05c060-3d04-4ea5-a0c7-1529891918be\") " pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:30.167207 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.167167 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk4tf\" (UniqueName: 
\"kubernetes.io/projected/9cdb24d7-91fc-426a-9656-6d7eae83d596-kube-api-access-bk4tf\") pod \"9cdb24d7-91fc-426a-9656-6d7eae83d596\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " Apr 22 18:43:30.167377 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.167256 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert\") pod \"9cdb24d7-91fc-426a-9656-6d7eae83d596\" (UID: \"9cdb24d7-91fc-426a-9656-6d7eae83d596\") " Apr 22 18:43:30.169496 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.169468 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert" (OuterVolumeSpecName: "cert") pod "9cdb24d7-91fc-426a-9656-6d7eae83d596" (UID: "9cdb24d7-91fc-426a-9656-6d7eae83d596"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:43:30.169595 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.169492 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdb24d7-91fc-426a-9656-6d7eae83d596-kube-api-access-bk4tf" (OuterVolumeSpecName: "kube-api-access-bk4tf") pod "9cdb24d7-91fc-426a-9656-6d7eae83d596" (UID: "9cdb24d7-91fc-426a-9656-6d7eae83d596"). InnerVolumeSpecName "kube-api-access-bk4tf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:43:30.194861 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.194816 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:30.268457 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.268378 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bk4tf\" (UniqueName: \"kubernetes.io/projected/9cdb24d7-91fc-426a-9656-6d7eae83d596-kube-api-access-bk4tf\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:43:30.268457 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.268405 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9cdb24d7-91fc-426a-9656-6d7eae83d596-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\"" Apr 22 18:43:30.335583 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.335549 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-wf2bl"] Apr 22 18:43:30.338796 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:43:30.338768 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa05c060_3d04_4ea5_a0c7_1529891918be.slice/crio-9bcc59f07d2719be8656b11337c708c73fd391103b3c881ee2122fd623686195 WatchSource:0}: Error finding container 9bcc59f07d2719be8656b11337c708c73fd391103b3c881ee2122fd623686195: Status 404 returned error can't find the container with id 9bcc59f07d2719be8656b11337c708c73fd391103b3c881ee2122fd623686195 Apr 22 18:43:30.387958 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.387901 2575 generic.go:358] "Generic (PLEG): container finished" podID="9cdb24d7-91fc-426a-9656-6d7eae83d596" containerID="655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500" exitCode=0 Apr 22 18:43:30.388153 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.387994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" 
event={"ID":"9cdb24d7-91fc-426a-9656-6d7eae83d596","Type":"ContainerDied","Data":"655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500"} Apr 22 18:43:30.388153 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.388032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" event={"ID":"9cdb24d7-91fc-426a-9656-6d7eae83d596","Type":"ContainerDied","Data":"2172e853da54dc81c1bc23e7113158c18f9cd155b4c491d50047007657e1cb48"} Apr 22 18:43:30.388153 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.388000 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-xn97h" Apr 22 18:43:30.388153 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.388055 2575 scope.go:117] "RemoveContainer" containerID="655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500" Apr 22 18:43:30.389270 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.389247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" event={"ID":"fa05c060-3d04-4ea5-a0c7-1529891918be","Type":"ContainerStarted","Data":"9bcc59f07d2719be8656b11337c708c73fd391103b3c881ee2122fd623686195"} Apr 22 18:43:30.396727 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.396710 2575 scope.go:117] "RemoveContainer" containerID="655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500" Apr 22 18:43:30.397003 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:43:30.396979 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500\": container with ID starting with 655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500 not found: ID does not exist" containerID="655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500" Apr 22 18:43:30.397097 ip-10-0-129-142 kubenswrapper[2575]: I0422 
18:43:30.397010 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500"} err="failed to get container status \"655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500\": rpc error: code = NotFound desc = could not find container \"655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500\": container with ID starting with 655ba7bfcac59a1ef247d7bb9d4eba79bae9f64bb349ce0b722e1c5c1f686500 not found: ID does not exist" Apr 22 18:43:30.417651 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.417619 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-xn97h"] Apr 22 18:43:30.432424 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.432392 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-xn97h"] Apr 22 18:43:30.934519 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:30.934447 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cdb24d7-91fc-426a-9656-6d7eae83d596" path="/var/lib/kubelet/pods/9cdb24d7-91fc-426a-9656-6d7eae83d596/volumes" Apr 22 18:43:31.394260 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:31.394220 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" event={"ID":"fa05c060-3d04-4ea5-a0c7-1529891918be","Type":"ContainerStarted","Data":"bd31f89d1e8b00fd5b2f9de088d40e453f1718f4858172f9b78e24ff709b9350"} Apr 22 18:43:31.394468 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:31.394309 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:43:31.420811 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:43:31.420759 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" 
podStartSLOduration=1.893700746 podStartE2EDuration="2.420744741s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:30.339993512 +0000 UTC m=+406.047215919" lastFinishedPulling="2026-04-22 18:43:30.867037504 +0000 UTC m=+406.574259914" observedRunningTime="2026-04-22 18:43:31.418279604 +0000 UTC m=+407.125502032" watchObservedRunningTime="2026-04-22 18:43:31.420744741 +0000 UTC m=+407.127967172" Apr 22 18:44:02.404065 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:02.403987 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-d9c56dd68-wf2bl" Apr 22 18:44:03.340034 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.339997 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-bflvc"] Apr 22 18:44:03.340367 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.340354 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cdb24d7-91fc-426a-9656-6d7eae83d596" containerName="manager" Apr 22 18:44:03.340436 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.340369 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdb24d7-91fc-426a-9656-6d7eae83d596" containerName="manager" Apr 22 18:44:03.340436 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.340429 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cdb24d7-91fc-426a-9656-6d7eae83d596" containerName="manager" Apr 22 18:44:03.343375 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.343355 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:03.346121 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.346098 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 18:44:03.346242 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.346105 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-x54bb\"" Apr 22 18:44:03.352610 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.352585 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-bflvc"] Apr 22 18:44:03.357637 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.357616 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-c7rr4"] Apr 22 18:44:03.361054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.361038 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:03.363833 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.363813 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:44:03.363917 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.363856 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-jpflf\"" Apr 22 18:44:03.372223 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.372198 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c7rr4"] Apr 22 18:44:03.424829 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.424793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cba09ab0-c861-4450-a984-c3deb580a7e2-tls-certs\") pod \"model-serving-api-86f7b4b499-bflvc\" 
(UID: \"cba09ab0-c861-4450-a984-c3deb580a7e2\") " pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:03.424829 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.424833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4eb6ad7-2801-4581-8f89-a322e7d998b4-cert\") pod \"odh-model-controller-696fc77849-c7rr4\" (UID: \"a4eb6ad7-2801-4581-8f89-a322e7d998b4\") " pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:03.425258 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.424858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8r8p\" (UniqueName: \"kubernetes.io/projected/a4eb6ad7-2801-4581-8f89-a322e7d998b4-kube-api-access-n8r8p\") pod \"odh-model-controller-696fc77849-c7rr4\" (UID: \"a4eb6ad7-2801-4581-8f89-a322e7d998b4\") " pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:03.425258 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.424953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwfmb\" (UniqueName: \"kubernetes.io/projected/cba09ab0-c861-4450-a984-c3deb580a7e2-kube-api-access-cwfmb\") pod \"model-serving-api-86f7b4b499-bflvc\" (UID: \"cba09ab0-c861-4450-a984-c3deb580a7e2\") " pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:03.525415 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.525380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cba09ab0-c861-4450-a984-c3deb580a7e2-tls-certs\") pod \"model-serving-api-86f7b4b499-bflvc\" (UID: \"cba09ab0-c861-4450-a984-c3deb580a7e2\") " pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:03.525594 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.525423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/a4eb6ad7-2801-4581-8f89-a322e7d998b4-cert\") pod \"odh-model-controller-696fc77849-c7rr4\" (UID: \"a4eb6ad7-2801-4581-8f89-a322e7d998b4\") " pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:03.525594 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.525462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8r8p\" (UniqueName: \"kubernetes.io/projected/a4eb6ad7-2801-4581-8f89-a322e7d998b4-kube-api-access-n8r8p\") pod \"odh-model-controller-696fc77849-c7rr4\" (UID: \"a4eb6ad7-2801-4581-8f89-a322e7d998b4\") " pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:03.525594 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.525495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfmb\" (UniqueName: \"kubernetes.io/projected/cba09ab0-c861-4450-a984-c3deb580a7e2-kube-api-access-cwfmb\") pod \"model-serving-api-86f7b4b499-bflvc\" (UID: \"cba09ab0-c861-4450-a984-c3deb580a7e2\") " pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:03.525594 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:44:03.525532 2575 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 22 18:44:03.525594 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:44:03.525582 2575 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:44:03.525847 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:44:03.525592 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba09ab0-c861-4450-a984-c3deb580a7e2-tls-certs podName:cba09ab0-c861-4450-a984-c3deb580a7e2 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:04.02557298 +0000 UTC m=+439.732795402 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/cba09ab0-c861-4450-a984-c3deb580a7e2-tls-certs") pod "model-serving-api-86f7b4b499-bflvc" (UID: "cba09ab0-c861-4450-a984-c3deb580a7e2") : secret "model-serving-api-tls" not found Apr 22 18:44:03.525847 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:44:03.525647 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4eb6ad7-2801-4581-8f89-a322e7d998b4-cert podName:a4eb6ad7-2801-4581-8f89-a322e7d998b4 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:04.025630452 +0000 UTC m=+439.732852863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a4eb6ad7-2801-4581-8f89-a322e7d998b4-cert") pod "odh-model-controller-696fc77849-c7rr4" (UID: "a4eb6ad7-2801-4581-8f89-a322e7d998b4") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:44:03.537051 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.537027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8r8p\" (UniqueName: \"kubernetes.io/projected/a4eb6ad7-2801-4581-8f89-a322e7d998b4-kube-api-access-n8r8p\") pod \"odh-model-controller-696fc77849-c7rr4\" (UID: \"a4eb6ad7-2801-4581-8f89-a322e7d998b4\") " pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:03.537175 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:03.537142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwfmb\" (UniqueName: \"kubernetes.io/projected/cba09ab0-c861-4450-a984-c3deb580a7e2-kube-api-access-cwfmb\") pod \"model-serving-api-86f7b4b499-bflvc\" (UID: \"cba09ab0-c861-4450-a984-c3deb580a7e2\") " pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:04.030354 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.030314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cba09ab0-c861-4450-a984-c3deb580a7e2-tls-certs\") pod \"model-serving-api-86f7b4b499-bflvc\" (UID: \"cba09ab0-c861-4450-a984-c3deb580a7e2\") " pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:04.030654 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.030631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4eb6ad7-2801-4581-8f89-a322e7d998b4-cert\") pod \"odh-model-controller-696fc77849-c7rr4\" (UID: \"a4eb6ad7-2801-4581-8f89-a322e7d998b4\") " pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:04.033015 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.032987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cba09ab0-c861-4450-a984-c3deb580a7e2-tls-certs\") pod \"model-serving-api-86f7b4b499-bflvc\" (UID: \"cba09ab0-c861-4450-a984-c3deb580a7e2\") " pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:04.033111 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.033033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a4eb6ad7-2801-4581-8f89-a322e7d998b4-cert\") pod \"odh-model-controller-696fc77849-c7rr4\" (UID: \"a4eb6ad7-2801-4581-8f89-a322e7d998b4\") " pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:04.254448 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.254405 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:04.273251 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.273213 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:04.390534 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.390508 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-bflvc"] Apr 22 18:44:04.393458 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:44:04.393426 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba09ab0_c861_4450_a984_c3deb580a7e2.slice/crio-78bc9ec3d0d09489580f38e682b770c2f3792478fcc52e309aee7f4ba3f79f8b WatchSource:0}: Error finding container 78bc9ec3d0d09489580f38e682b770c2f3792478fcc52e309aee7f4ba3f79f8b: Status 404 returned error can't find the container with id 78bc9ec3d0d09489580f38e682b770c2f3792478fcc52e309aee7f4ba3f79f8b Apr 22 18:44:04.413042 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.413019 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-c7rr4"] Apr 22 18:44:04.416057 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:44:04.416029 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4eb6ad7_2801_4581_8f89_a322e7d998b4.slice/crio-15594cb95a66de43bd9012b702b1a309d7d9f9e253fc603c5eca013804ad3ef9 WatchSource:0}: Error finding container 15594cb95a66de43bd9012b702b1a309d7d9f9e253fc603c5eca013804ad3ef9: Status 404 returned error can't find the container with id 15594cb95a66de43bd9012b702b1a309d7d9f9e253fc603c5eca013804ad3ef9 Apr 22 18:44:04.507505 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.507472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c7rr4" event={"ID":"a4eb6ad7-2801-4581-8f89-a322e7d998b4","Type":"ContainerStarted","Data":"15594cb95a66de43bd9012b702b1a309d7d9f9e253fc603c5eca013804ad3ef9"} Apr 22 18:44:04.508483 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:04.508455 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-bflvc" event={"ID":"cba09ab0-c861-4450-a984-c3deb580a7e2","Type":"ContainerStarted","Data":"78bc9ec3d0d09489580f38e682b770c2f3792478fcc52e309aee7f4ba3f79f8b"} Apr 22 18:44:07.522672 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:07.522639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-c7rr4" event={"ID":"a4eb6ad7-2801-4581-8f89-a322e7d998b4","Type":"ContainerStarted","Data":"f4c4dd371c8decd4c15114cdb3c4d769399267fc7bd5c339bec70aa18bf353df"} Apr 22 18:44:07.523223 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:07.522719 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-c7rr4" Apr 22 18:44:07.524018 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:07.523994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-bflvc" event={"ID":"cba09ab0-c861-4450-a984-c3deb580a7e2","Type":"ContainerStarted","Data":"009e76e8376840dae55a67a175615366accfe980d7d2677d124a609f37a220e6"} Apr 22 18:44:07.524160 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:07.524147 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-bflvc" Apr 22 18:44:07.542011 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:07.541955 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-c7rr4" podStartSLOduration=1.5128045289999998 podStartE2EDuration="4.541942303s" podCreationTimestamp="2026-04-22 18:44:03 +0000 UTC" firstStartedPulling="2026-04-22 18:44:04.41728102 +0000 UTC m=+440.124503425" lastFinishedPulling="2026-04-22 18:44:07.446418777 +0000 UTC m=+443.153641199" observedRunningTime="2026-04-22 18:44:07.540112934 +0000 UTC m=+443.247335374" watchObservedRunningTime="2026-04-22 18:44:07.541942303 +0000 UTC m=+443.249164725" 
Apr 22 18:44:07.558007 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:07.557960 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-bflvc" podStartSLOduration=1.5539089179999999 podStartE2EDuration="4.557916789s" podCreationTimestamp="2026-04-22 18:44:03 +0000 UTC" firstStartedPulling="2026-04-22 18:44:04.395551706 +0000 UTC m=+440.102774111" lastFinishedPulling="2026-04-22 18:44:07.399559576 +0000 UTC m=+443.106781982" observedRunningTime="2026-04-22 18:44:07.55702134 +0000 UTC m=+443.264243769" watchObservedRunningTime="2026-04-22 18:44:07.557916789 +0000 UTC m=+443.265139217" Apr 22 18:44:10.440158 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.440126 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dbc864794-8qh9c"] Apr 22 18:44:10.443549 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.443530 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.455188 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.455167 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbc864794-8qh9c"] Apr 22 18:44:10.492620 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.492584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-service-ca\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.492620 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.492624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-oauth-serving-cert\") pod \"console-5dbc864794-8qh9c\" (UID: 
\"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.492851 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.492680 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-trusted-ca-bundle\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.492851 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.492717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2286f\" (UniqueName: \"kubernetes.io/projected/e8189a3b-5e91-4a92-b17d-7244425ff390-kube-api-access-2286f\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.492851 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.492793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-console-config\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.492851 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.492837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8189a3b-5e91-4a92-b17d-7244425ff390-console-serving-cert\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.493054 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.492865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8189a3b-5e91-4a92-b17d-7244425ff390-console-oauth-config\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.594355 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.594320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-console-config\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.594355 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.594360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8189a3b-5e91-4a92-b17d-7244425ff390-console-serving-cert\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.594555 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.594383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8189a3b-5e91-4a92-b17d-7244425ff390-console-oauth-config\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.594555 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.594495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-service-ca\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.594555 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.594550 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-oauth-serving-cert\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.594680 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.594587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-trusted-ca-bundle\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.594680 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.594620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2286f\" (UniqueName: \"kubernetes.io/projected/e8189a3b-5e91-4a92-b17d-7244425ff390-kube-api-access-2286f\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.595412 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.595380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-oauth-serving-cert\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.595412 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.595406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-console-config\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 
18:44:10.595573 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.595475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-trusted-ca-bundle\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.595573 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.595490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e8189a3b-5e91-4a92-b17d-7244425ff390-service-ca\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.597042 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.597017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e8189a3b-5e91-4a92-b17d-7244425ff390-console-oauth-config\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.597134 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.597022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e8189a3b-5e91-4a92-b17d-7244425ff390-console-serving-cert\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.604870 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.604847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2286f\" (UniqueName: \"kubernetes.io/projected/e8189a3b-5e91-4a92-b17d-7244425ff390-kube-api-access-2286f\") pod \"console-5dbc864794-8qh9c\" (UID: \"e8189a3b-5e91-4a92-b17d-7244425ff390\") " 
pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.753150 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.753049 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dbc864794-8qh9c" Apr 22 18:44:10.894470 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:10.894442 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dbc864794-8qh9c"] Apr 22 18:44:10.896444 ip-10-0-129-142 kubenswrapper[2575]: W0422 18:44:10.896417 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8189a3b_5e91_4a92_b17d_7244425ff390.slice/crio-eac0ac74e13deaf10eb1b63dd1c22fa16377c7fe95fb675438fcd05c2dccd96e WatchSource:0}: Error finding container eac0ac74e13deaf10eb1b63dd1c22fa16377c7fe95fb675438fcd05c2dccd96e: Status 404 returned error can't find the container with id eac0ac74e13deaf10eb1b63dd1c22fa16377c7fe95fb675438fcd05c2dccd96e Apr 22 18:44:11.539901 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:11.539862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbc864794-8qh9c" event={"ID":"e8189a3b-5e91-4a92-b17d-7244425ff390","Type":"ContainerStarted","Data":"a70cb5eda3cb4426a6f5e16db9466bd9a8c31b0ca060ac2642cac766f773b7d4"} Apr 22 18:44:11.539901 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:11.539904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dbc864794-8qh9c" event={"ID":"e8189a3b-5e91-4a92-b17d-7244425ff390","Type":"ContainerStarted","Data":"eac0ac74e13deaf10eb1b63dd1c22fa16377c7fe95fb675438fcd05c2dccd96e"} Apr 22 18:44:11.560558 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:11.560504 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dbc864794-8qh9c" podStartSLOduration=1.560489278 podStartE2EDuration="1.560489278s" podCreationTimestamp="2026-04-22 18:44:10 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:44:11.558698284 +0000 UTC m=+447.265920723" watchObservedRunningTime="2026-04-22 18:44:11.560489278 +0000 UTC m=+447.267711705"
Apr 22 18:44:18.530958 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:18.530886 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-c7rr4"
Apr 22 18:44:18.532898 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:18.532877 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-bflvc"
Apr 22 18:44:20.753547 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:20.753485 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dbc864794-8qh9c"
Apr 22 18:44:20.754011 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:20.753566 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5dbc864794-8qh9c"
Apr 22 18:44:20.758284 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:20.758262 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dbc864794-8qh9c"
Apr 22 18:44:21.579131 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:21.579098 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dbc864794-8qh9c"
Apr 22 18:44:21.630898 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:21.630851 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bcbcc4984-647zw"]
Apr 22 18:44:46.651216 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.651173 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bcbcc4984-647zw" podUID="829141e9-7509-4f9e-aca6-6fe11a0ed5b0" containerName="console" containerID="cri-o://39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6" gracePeriod=15
Apr 22 18:44:46.892834 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.892811 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bcbcc4984-647zw_829141e9-7509-4f9e-aca6-6fe11a0ed5b0/console/0.log"
Apr 22 18:44:46.892975 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.892871 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:44:46.920117 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920042 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-serving-cert\") pod \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") "
Apr 22 18:44:46.920117 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920078 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-service-ca\") pod \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") "
Apr 22 18:44:46.920117 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920105 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-config\") pod \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") "
Apr 22 18:44:46.920366 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920134 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-kube-api-access-h48lg\") pod \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") "
Apr 22 18:44:46.920366 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920170 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-trusted-ca-bundle\") pod \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") "
Apr 22 18:44:46.920366 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920237 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-oauth-serving-cert\") pod \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") "
Apr 22 18:44:46.920366 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920283 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-oauth-config\") pod \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\" (UID: \"829141e9-7509-4f9e-aca6-6fe11a0ed5b0\") "
Apr 22 18:44:46.920564 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920498 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-service-ca" (OuterVolumeSpecName: "service-ca") pod "829141e9-7509-4f9e-aca6-6fe11a0ed5b0" (UID: "829141e9-7509-4f9e-aca6-6fe11a0ed5b0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:44:46.920564 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920503 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-config" (OuterVolumeSpecName: "console-config") pod "829141e9-7509-4f9e-aca6-6fe11a0ed5b0" (UID: "829141e9-7509-4f9e-aca6-6fe11a0ed5b0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:44:46.920708 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920685 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-service-ca\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:44:46.920767 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920711 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:44:46.920823 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920792 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "829141e9-7509-4f9e-aca6-6fe11a0ed5b0" (UID: "829141e9-7509-4f9e-aca6-6fe11a0ed5b0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:44:46.920875 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.920839 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "829141e9-7509-4f9e-aca6-6fe11a0ed5b0" (UID: "829141e9-7509-4f9e-aca6-6fe11a0ed5b0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:44:46.922801 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.922768 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "829141e9-7509-4f9e-aca6-6fe11a0ed5b0" (UID: "829141e9-7509-4f9e-aca6-6fe11a0ed5b0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:44:46.922978 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.922895 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-kube-api-access-h48lg" (OuterVolumeSpecName: "kube-api-access-h48lg") pod "829141e9-7509-4f9e-aca6-6fe11a0ed5b0" (UID: "829141e9-7509-4f9e-aca6-6fe11a0ed5b0"). InnerVolumeSpecName "kube-api-access-h48lg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:44:46.923112 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:46.923094 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "829141e9-7509-4f9e-aca6-6fe11a0ed5b0" (UID: "829141e9-7509-4f9e-aca6-6fe11a0ed5b0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:44:47.021273 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.021242 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-oauth-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:44:47.021273 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.021275 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-oauth-config\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:44:47.021273 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.021289 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-console-serving-cert\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:44:47.021527 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.021304 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h48lg\" (UniqueName: \"kubernetes.io/projected/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-kube-api-access-h48lg\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:44:47.021527 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.021317 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/829141e9-7509-4f9e-aca6-6fe11a0ed5b0-trusted-ca-bundle\") on node \"ip-10-0-129-142.ec2.internal\" DevicePath \"\""
Apr 22 18:44:47.666793 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.666763 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bcbcc4984-647zw_829141e9-7509-4f9e-aca6-6fe11a0ed5b0/console/0.log"
Apr 22 18:44:47.667256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.666803 2575 generic.go:358] "Generic (PLEG): container finished" podID="829141e9-7509-4f9e-aca6-6fe11a0ed5b0" containerID="39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6" exitCode=2
Apr 22 18:44:47.667256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.666855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcbcc4984-647zw" event={"ID":"829141e9-7509-4f9e-aca6-6fe11a0ed5b0","Type":"ContainerDied","Data":"39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6"}
Apr 22 18:44:47.667256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.666873 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bcbcc4984-647zw"
Apr 22 18:44:47.667256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.666882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcbcc4984-647zw" event={"ID":"829141e9-7509-4f9e-aca6-6fe11a0ed5b0","Type":"ContainerDied","Data":"d9bd508d8d3e3d3c7dbe4640d07d5ad206971626e44031b89f0aa0732d1badca"}
Apr 22 18:44:47.667256 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.666898 2575 scope.go:117] "RemoveContainer" containerID="39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6"
Apr 22 18:44:47.675508 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.675478 2575 scope.go:117] "RemoveContainer" containerID="39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6"
Apr 22 18:44:47.675762 ip-10-0-129-142 kubenswrapper[2575]: E0422 18:44:47.675745 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6\": container with ID starting with 39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6 not found: ID does not exist" containerID="39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6"
Apr 22 18:44:47.675824 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.675772 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6"} err="failed to get container status \"39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6\": rpc error: code = NotFound desc = could not find container \"39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6\": container with ID starting with 39bf8fb7a1f686affdc861b87a7c2b18a1e80ed7af8a056ea999f23b50409de6 not found: ID does not exist"
Apr 22 18:44:47.688565 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.688538 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bcbcc4984-647zw"]
Apr 22 18:44:47.695187 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:47.695164 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bcbcc4984-647zw"]
Apr 22 18:44:48.935106 ip-10-0-129-142 kubenswrapper[2575]: I0422 18:44:48.935072 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829141e9-7509-4f9e-aca6-6fe11a0ed5b0" path="/var/lib/kubelet/pods/829141e9-7509-4f9e-aca6-6fe11a0ed5b0/volumes"
Apr 22 19:44:41.963675 ip-10-0-129-142 kubenswrapper[2575]: I0422 19:44:41.963635 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-d9c56dd68-wf2bl_fa05c060-3d04-4ea5-a0c7-1529891918be/manager/0.log"
Apr 22 19:47:42.460000 ip-10-0-129-142 kubenswrapper[2575]: I0422 19:47:42.459969 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-d9c56dd68-wf2bl_fa05c060-3d04-4ea5-a0c7-1529891918be/manager/0.log"
Apr 22 20:08:12.996084 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:12.996003 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9z6ln_f02372ee-17cf-4f7b-9ae1-04ef56a6c468/global-pull-secret-syncer/0.log"
Apr 22 20:08:13.163556 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:13.163500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tdjfh_e513b58f-d876-4d46-a313-2450da792164/konnectivity-agent/0.log"
Apr 22 20:08:13.257429 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:13.257310 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-142.ec2.internal_8c3f3197044afe14bb61b56250727776/haproxy/0.log"
Apr 22 20:08:16.426852 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.426818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c70d334e-87f8-4f5e-a03b-6bd1312b7570/alertmanager/0.log"
Apr 22 20:08:16.455058 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.455025 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c70d334e-87f8-4f5e-a03b-6bd1312b7570/config-reloader/0.log"
Apr 22 20:08:16.486792 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.486767 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c70d334e-87f8-4f5e-a03b-6bd1312b7570/kube-rbac-proxy-web/0.log"
Apr 22 20:08:16.515173 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.515149 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c70d334e-87f8-4f5e-a03b-6bd1312b7570/kube-rbac-proxy/0.log"
Apr 22 20:08:16.542119 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.542084 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c70d334e-87f8-4f5e-a03b-6bd1312b7570/kube-rbac-proxy-metric/0.log"
Apr 22 20:08:16.568712 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.568684 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c70d334e-87f8-4f5e-a03b-6bd1312b7570/prom-label-proxy/0.log"
Apr 22 20:08:16.596279 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.596238 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c70d334e-87f8-4f5e-a03b-6bd1312b7570/init-config-reloader/0.log"
Apr 22 20:08:16.835600 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.835490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zzrnk_e60c41f2-43ea-4538-b1cf-948a3af04d39/kube-state-metrics/0.log"
Apr 22 20:08:16.860812 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.860782 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zzrnk_e60c41f2-43ea-4538-b1cf-948a3af04d39/kube-rbac-proxy-main/0.log"
Apr 22 20:08:16.886010 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.885981 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-zzrnk_e60c41f2-43ea-4538-b1cf-948a3af04d39/kube-rbac-proxy-self/0.log"
Apr 22 20:08:16.915851 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.915824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7df4869b5b-rkw66_e6dcfb9f-eef0-420b-a3c2-00d4b1f219dd/metrics-server/0.log"
Apr 22 20:08:16.942118 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.942088 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cgrz7_f55f170e-96d3-496c-8e6a-ed27167d9344/monitoring-plugin/0.log"
Apr 22 20:08:16.977474 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:16.977430 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4pgk8_cee7930e-83ce-4067-9398-494f9751737a/node-exporter/0.log"
Apr 22 20:08:17.001288 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.001258 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4pgk8_cee7930e-83ce-4067-9398-494f9751737a/kube-rbac-proxy/0.log"
Apr 22 20:08:17.030087 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.030060 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4pgk8_cee7930e-83ce-4067-9398-494f9751737a/init-textfile/0.log"
Apr 22 20:08:17.259888 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.259859 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hzthd_bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e/kube-rbac-proxy-main/0.log"
Apr 22 20:08:17.284664 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.284633 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hzthd_bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e/kube-rbac-proxy-self/0.log"
Apr 22 20:08:17.308735 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.308707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hzthd_bd1bf2b3-bfa7-4038-bcc0-3fe40aa48c6e/openshift-state-metrics/0.log"
Apr 22 20:08:17.543806 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.543716 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ddhgb_39e00ba2-0cd0-486e-be96-48d1b4f2448f/prometheus-operator/0.log"
Apr 22 20:08:17.568406 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.568379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ddhgb_39e00ba2-0cd0-486e-be96-48d1b4f2448f/kube-rbac-proxy/0.log"
Apr 22 20:08:17.627954 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.627905 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d8cf9c78b-v9ss6_95404392-caeb-4280-9b51-cec352e540b0/telemeter-client/0.log"
Apr 22 20:08:17.662624 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.662595 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d8cf9c78b-v9ss6_95404392-caeb-4280-9b51-cec352e540b0/reload/0.log"
Apr 22 20:08:17.691549 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.691507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d8cf9c78b-v9ss6_95404392-caeb-4280-9b51-cec352e540b0/kube-rbac-proxy/0.log"
Apr 22 20:08:17.739866 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.739834 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ffccd8cb5-bkzp6_3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909/thanos-query/0.log"
Apr 22 20:08:17.776038 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.775995 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ffccd8cb5-bkzp6_3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909/kube-rbac-proxy-web/0.log"
Apr 22 20:08:17.811303 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.811201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ffccd8cb5-bkzp6_3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909/kube-rbac-proxy/0.log"
Apr 22 20:08:17.841510 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.841484 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ffccd8cb5-bkzp6_3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909/prom-label-proxy/0.log"
Apr 22 20:08:17.868323 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.868273 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ffccd8cb5-bkzp6_3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909/kube-rbac-proxy-rules/0.log"
Apr 22 20:08:17.897898 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:17.897860 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7ffccd8cb5-bkzp6_3578fdd1-e2ae-4aa0-9d35-5dd1f6de3909/kube-rbac-proxy-metrics/0.log"
Apr 22 20:08:19.729222 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.729189 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dbc864794-8qh9c_e8189a3b-5e91-4a92-b17d-7244425ff390/console/0.log"
Apr 22 20:08:19.854169 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.854134 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"]
Apr 22 20:08:19.854575 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.854560 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="829141e9-7509-4f9e-aca6-6fe11a0ed5b0" containerName="console"
Apr 22 20:08:19.854634 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.854576 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="829141e9-7509-4f9e-aca6-6fe11a0ed5b0" containerName="console"
Apr 22 20:08:19.854675 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.854646 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="829141e9-7509-4f9e-aca6-6fe11a0ed5b0" containerName="console"
Apr 22 20:08:19.857978 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.857950 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.860648 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.860623 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gsp\"/\"kube-root-ca.crt\""
Apr 22 20:08:19.860792 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.860623 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gsp\"/\"openshift-service-ca.crt\""
Apr 22 20:08:19.861861 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.861845 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k2gsp\"/\"default-dockercfg-dszbq\""
Apr 22 20:08:19.867357 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.867331 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"]
Apr 22 20:08:19.879699 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.879666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-sys\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.879857 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.879713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwls\" (UniqueName: \"kubernetes.io/projected/9040e742-eae4-4402-8c9d-0b8a111cf35f-kube-api-access-7rwls\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.879857 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.879734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-proc\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.879857 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.879833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-podres\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.879996 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.879881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-lib-modules\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981191 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-podres\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981191 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-lib-modules\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981380 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-sys\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981380 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwls\" (UniqueName: \"kubernetes.io/projected/9040e742-eae4-4402-8c9d-0b8a111cf35f-kube-api-access-7rwls\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981380 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-lib-modules\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981380 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-proc\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981380 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-sys\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981380 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-podres\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.981380 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.981381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9040e742-eae4-4402-8c9d-0b8a111cf35f-proc\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:19.991232 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:19.991198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwls\" (UniqueName: \"kubernetes.io/projected/9040e742-eae4-4402-8c9d-0b8a111cf35f-kube-api-access-7rwls\") pod \"perf-node-gather-daemonset-nh472\" (UID: \"9040e742-eae4-4402-8c9d-0b8a111cf35f\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:20.169669 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.169620 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:20.299162 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.299133 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"]
Apr 22 20:08:20.301397 ip-10-0-129-142 kubenswrapper[2575]: W0422 20:08:20.301368 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9040e742_eae4_4402_8c9d_0b8a111cf35f.slice/crio-7f8d4714a35e67dd2f7a1f7e42b96a91d0cad9953a13436e36f56e7a7f7ec3bb WatchSource:0}: Error finding container 7f8d4714a35e67dd2f7a1f7e42b96a91d0cad9953a13436e36f56e7a7f7ec3bb: Status 404 returned error can't find the container with id 7f8d4714a35e67dd2f7a1f7e42b96a91d0cad9953a13436e36f56e7a7f7ec3bb
Apr 22 20:08:20.303051 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.303032 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:08:20.618232 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.618141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472" event={"ID":"9040e742-eae4-4402-8c9d-0b8a111cf35f","Type":"ContainerStarted","Data":"d7d19d63633e70967adece93245f4d3df222263ec56142659e7088faf257339b"}
Apr 22 20:08:20.618232 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.618179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472" event={"ID":"9040e742-eae4-4402-8c9d-0b8a111cf35f","Type":"ContainerStarted","Data":"7f8d4714a35e67dd2f7a1f7e42b96a91d0cad9953a13436e36f56e7a7f7ec3bb"}
Apr 22 20:08:20.618423 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.618252 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:20.635737 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.635687 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472" podStartSLOduration=1.635672692 podStartE2EDuration="1.635672692s" podCreationTimestamp="2026-04-22 20:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:08:20.634888492 +0000 UTC m=+5496.342110922" watchObservedRunningTime="2026-04-22 20:08:20.635672692 +0000 UTC m=+5496.342895119"
Apr 22 20:08:20.906840 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.906788 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nlsz6_43bb7de4-00c0-4077-a922-4be165674fa5/dns/0.log"
Apr 22 20:08:20.934054 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:20.934015 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nlsz6_43bb7de4-00c0-4077-a922-4be165674fa5/kube-rbac-proxy/0.log"
Apr 22 20:08:21.056491 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:21.056463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vnqvq_6fd27c1a-c1d0-44cd-b22e-2ac77ce9dd17/dns-node-resolver/0.log"
Apr 22 20:08:21.527579 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:21.527531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-smstv_2b064dca-dd29-4d80-aaba-ffe324b856eb/node-ca/0.log"
Apr 22 20:08:22.619872 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:22.619841 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fcfqp_9a0dbd88-fa39-4383-b147-1af4bcb23b83/serve-healthcheck-canary/0.log"
Apr 22 20:08:23.239975 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:23.239944 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ndm6t_d86ab162-e73e-453a-99c0-3c1735a2db66/kube-rbac-proxy/0.log"
Apr 22 20:08:23.262599 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:23.262576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ndm6t_d86ab162-e73e-453a-99c0-3c1735a2db66/exporter/0.log"
Apr 22 20:08:23.285741 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:23.285718 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ndm6t_d86ab162-e73e-453a-99c0-3c1735a2db66/extractor/0.log"
Apr 22 20:08:25.171236 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:25.171188 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-d9c56dd68-wf2bl_fa05c060-3d04-4ea5-a0c7-1529891918be/manager/0.log"
Apr 22 20:08:25.193547 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:25.193515 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-gn6l8_6b959351-ba4a-4bfa-8aeb-9ba4e0e2ae69/manager/0.log"
Apr 22 20:08:25.216653 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:25.216622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-bflvc_cba09ab0-c861-4450-a984-c3deb580a7e2/server/0.log"
Apr 22 20:08:25.276692 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:25.276648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-c7rr4_a4eb6ad7-2801-4581-8f89-a322e7d998b4/manager/0.log"
Apr 22 20:08:26.632312 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:26.632278 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-nh472"
Apr 22 20:08:31.252724 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.252690 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krfvd_90bf1722-06bc-4b40-aeb1-0f3d321b412b/kube-multus-additional-cni-plugins/0.log"
Apr 22 20:08:31.286325 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.286296 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krfvd_90bf1722-06bc-4b40-aeb1-0f3d321b412b/egress-router-binary-copy/0.log"
Apr 22 20:08:31.313641 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.313612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krfvd_90bf1722-06bc-4b40-aeb1-0f3d321b412b/cni-plugins/0.log"
Apr 22 20:08:31.343351 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.343322 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krfvd_90bf1722-06bc-4b40-aeb1-0f3d321b412b/bond-cni-plugin/0.log"
Apr 22 20:08:31.370181 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.370154 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krfvd_90bf1722-06bc-4b40-aeb1-0f3d321b412b/routeoverride-cni/0.log"
Apr 22 20:08:31.392973 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.392948 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krfvd_90bf1722-06bc-4b40-aeb1-0f3d321b412b/whereabouts-cni-bincopy/0.log"
Apr 22 20:08:31.419081 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.419055 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krfvd_90bf1722-06bc-4b40-aeb1-0f3d321b412b/whereabouts-cni/0.log"
Apr 22 20:08:31.484840 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.484809 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qjb4s_828d62db-71ec-4b5b-b36f-70124c98ab1b/kube-multus/0.log"
Apr 22 20:08:31.626102 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.626030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sx6lq_261a5574-d8e0-4ff1-99d3-ac73ec229f92/network-metrics-daemon/0.log"
Apr 22 20:08:31.678530 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:31.678503 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sx6lq_261a5574-d8e0-4ff1-99d3-ac73ec229f92/kube-rbac-proxy/0.log"
Apr 22 20:08:33.209114 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.209066 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/ovn-controller/0.log"
Apr 22 20:08:33.257208 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.257181 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/ovn-acl-logging/0.log"
Apr 22 20:08:33.274366 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.274342 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/kube-rbac-proxy-node/0.log"
Apr 22 20:08:33.295980 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.295958 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 20:08:33.320237 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.320209 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/northd/0.log"
Apr 22 20:08:33.345621 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.345588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/nbdb/0.log"
Apr 22 20:08:33.367186
ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.367164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/sbdb/0.log" Apr 22 20:08:33.465455 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:33.465385 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zbqgd_08a25179-34e4-4328-9e3a-dfcc4b1aa177/ovnkube-controller/0.log" Apr 22 20:08:34.510483 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:34.510457 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xg9qw_5664213f-7ddf-424a-af58-77eee7d05998/network-check-target-container/0.log" Apr 22 20:08:35.434751 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:35.434722 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-f4ls2_fb9d517f-20af-4295-b872-990841be4bb8/iptables-alerter/0.log" Apr 22 20:08:36.166123 ip-10-0-129-142 kubenswrapper[2575]: I0422 20:08:36.166096 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vgxcc_bd94de2c-a623-493f-b731-6e8e75d73ccb/tuned/0.log"