Apr 22 21:08:38.944005 ip-10-0-138-20 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 21:08:39.474625 ip-10-0-138-20 kubenswrapper[2546]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:08:39.474625 ip-10-0-138-20 kubenswrapper[2546]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 21:08:39.474625 ip-10-0-138-20 kubenswrapper[2546]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:08:39.474625 ip-10-0-138-20 kubenswrapper[2546]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 21:08:39.474625 ip-10-0-138-20 kubenswrapper[2546]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:08:39.477448 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.477356 2546 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 21:08:39.482336 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482313 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:08:39.482336 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482332 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:08:39.482336 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482336 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:39.482336 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482339 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:08:39.482336 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482342 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:08:39.482336 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482345 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482348 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482351 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482354 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482356 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482360 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482363 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482366 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482368 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482371 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482374 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482377 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482380 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482382 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482385 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482387 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482390 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482393 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482395 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:08:39.482563 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482398 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482400 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482403 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482409 2546 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482412 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482414 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482417 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482420 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482422 2546 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482424 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482427 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482430 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482432 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482434 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482437 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482440 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482443 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482446 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482448 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482451 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:08:39.483067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482454 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482457 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482459 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482462 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482465 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482467 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482471 2546 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482476 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482480 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482482 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482485 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482487 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482490 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482493 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482496 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482499 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482501 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482503 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482506 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:08:39.483567 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482509 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482511 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482514 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482516 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482519 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482522 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482524 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482526 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482531 2546 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482533 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482536 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482539 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482541 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482545 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482548 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482551 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482556 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482560 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482563 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482566 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:08:39.484041 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482569 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482572 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482575 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482982 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482988 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482991 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482994 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.482997 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483000 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483003 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483005 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483008 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483011 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483014 2546 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483016 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483019 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483022 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483024 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483027 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483029 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:08:39.484546 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483033 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483035 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483038 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483040 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483043 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483046 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483048 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483051 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483054 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483056 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483059 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483062 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483064 2546 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483067 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483070 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483072 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483075 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483078 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483080 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483083 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:39.485039 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483086 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483088 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483091 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483093 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483096 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483099 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483102 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483104 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483106 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483109 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483111 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483115 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483118 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483122 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483125 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483128 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483131 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483133 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483136 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483139 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:08:39.485538 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483142 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483144 2546 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483147 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483149 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483152 2546 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483154 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483157 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483159 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483162 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483165 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483168 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483171 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483174 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483176 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483179 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483181 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483184 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483186 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483189 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:08:39.486084 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483191 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483195 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483197 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483200 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483202 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483205 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483208 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483210 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483213 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.483216 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484911 2546 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484921 2546 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484927 2546 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484932 2546 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484938 2546 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484942 2546 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484946 2546 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484951 2546 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484954 2546 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484957 2546 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484961 2546 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 21:08:39.486536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484964 2546 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484967 2546 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484970 2546 flags.go:64] FLAG: --cgroup-root=""
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484972 2546 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484975 2546 flags.go:64] FLAG: --client-ca-file=""
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484978 2546 flags.go:64] FLAG: --cloud-config=""
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484981 2546 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484983 2546 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484988 2546 flags.go:64] FLAG: --cluster-domain=""
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484991 2546 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484994 2546 flags.go:64] FLAG: --config-dir=""
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.484997 2546 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485000 2546 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485004 2546 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485007 2546 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485011 2546 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485014 2546 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485017 2546 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485020 2546 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485023 2546 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485026 2546 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485030 2546 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485034 2546 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485037 2546 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485041 2546 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 21:08:39.487112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485044 2546 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485048 2546 flags.go:64] FLAG: --enable-server="true"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485051 2546 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485055 2546 flags.go:64] FLAG: --event-burst="100"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485058 2546 flags.go:64] FLAG: --event-qps="50"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485062 2546 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485065 2546 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485067 2546 flags.go:64] FLAG: --eviction-hard=""
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485071 2546 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485074 2546 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485077 2546 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485080 2546 flags.go:64] FLAG: --eviction-soft=""
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485082 2546 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485085 2546 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485088 2546 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485091 2546 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485094 2546 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485097 2546 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485100 2546 flags.go:64] FLAG: --feature-gates=""
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485104 2546 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485107 2546 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485110 2546 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485113 2546 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485116 2546 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485119 2546 flags.go:64] FLAG: --help="false"
Apr 22 21:08:39.487765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485122 2546 flags.go:64] FLAG: --hostname-override="ip-10-0-138-20.ec2.internal"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485125 2546 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485128 2546 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485130 2546 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485134 2546 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485137 2546 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485139 2546 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485142 2546 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485145 2546 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485148 2546 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485151 2546 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485154 2546 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485156 2546 flags.go:64] FLAG: --kube-reserved=""
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485159 2546 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485162 2546 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485165 2546 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485167 2546 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485170 2546 flags.go:64] FLAG: --lock-file=""
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485172 2546 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485175 2546 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485178 2546 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485183 2546 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485186 2546 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485188 2546 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 21:08:39.488368 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485191 2546 flags.go:64] FLAG: --logging-format="text"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485194 2546 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485197 2546 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485200 2546 flags.go:64] FLAG: --manifest-url=""
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485203 2546 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485207 2546 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485210 2546 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485214 2546 flags.go:64] FLAG: --max-pods="110"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485217 2546 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485220 2546 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485223 2546 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485226 2546 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485229 2546 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485232 2546 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485234 2546 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485242 2546 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485245 2546 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485248 2546 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485251 2546 flags.go:64] FLAG: --pod-cidr=""
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485254 2546 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485259 2546 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485262 2546 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485265 2546 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485268 2546 flags.go:64] FLAG: --port="10250"
Apr 22 21:08:39.488963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485271 2546 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485274 2546 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f2d4893fef7d3554"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485277 2546 flags.go:64] FLAG: --qos-reserved=""
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485280 2546 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485283 2546 flags.go:64] FLAG: --register-node="true"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485285 2546 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485288 2546 flags.go:64] FLAG: --register-with-taints=""
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485292 2546 flags.go:64] FLAG: --registry-burst="10"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485294 2546 flags.go:64] FLAG: --registry-qps="5"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485297 2546 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485300 2546 flags.go:64] FLAG: --reserved-memory=""
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485304 2546 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485307 2546 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485310 2546 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485313 2546 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485315 2546 flags.go:64] FLAG: --runonce="false"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485318 2546 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485321 2546 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485324 2546 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485327 2546 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485330 2546 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485333 2546 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485336 2546 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485339 2546 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485341 2546 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485344 2546 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 21:08:39.489525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485347 2546 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485351 2546 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485354 2546 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485358 2546 flags.go:64] FLAG: --system-cgroups=""
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485360 2546 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485366 2546 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485369 2546 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485372 2546 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485376 2546 flags.go:64] FLAG: --tls-min-version=""
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485379 2546 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485381 2546 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485384 2546 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485387 2546 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485390 2546 flags.go:64] FLAG: --v="2"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485394 2546 flags.go:64] FLAG: --version="false"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485403 2546 flags.go:64] FLAG: --vmodule=""
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485408 2546 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.485413 2546 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485511 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485515 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485518 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485521 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485524 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:08:39.490160 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485527 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485530 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485533 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485536 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485538 2546 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485541 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485544 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485547 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485550 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485552 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485555 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485560 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485563 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485566 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485568 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485571 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485574 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485576 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485579 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485581 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:08:39.490729 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485584 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485589 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485592 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485594 2546 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485597 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485599 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485604 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485607 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485609 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485613 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485615 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485618 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485621 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485623 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485626 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485628 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485631 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485634 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485637 2546 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485639 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:08:39.491255 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485660 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485663 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485665 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485668 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485671 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485674 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485677 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485679 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485682 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485684 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485688 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485692 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485695 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485699 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485702 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485704 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485707 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485709 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485713 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:08:39.491778 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485716 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485719 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485721 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485724 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485726 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485729 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485732 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485734 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485737 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485739 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485741 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485744 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485747 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485749 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485752 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485754 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485757 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485759 2546 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485762 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485766 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
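The long runs of "unrecognized feature gate" warnings above all come from feature_gate.go:328. The kubelet configuration rendered onto this node appears to carry the full cluster-level (OpenShift) gate list, names like GatewayAPI, NewOLM, and ManagedBootImages, while the kubelet's own registry only knows the upstream Kubernetes gates; each unknown name is therefore warned about and ignored rather than treated as fatal. A minimal hand-rolled sketch of that parse-and-warn behavior, with invented names (known, applyGates), not the real k8s.io/component-base/featuregate implementation:

package main

import (
	"fmt"
	"sort"
)

// known mirrors a kubelet-style registry: only upstream Kubernetes
// gates are registered; cluster-level gate names are absent on purpose.
var known = map[string]bool{ // gate name -> default value
	"DynamicResourceAllocation": false,
	"ImageVolume":               true,
	"KMSv1":                     false,
	"NodeSwap":                  false,
}

// applyGates mimics the behavior seen at feature_gate.go:328: unknown
// keys produce a warning and are skipped; known keys override defaults.
func applyGates(requested map[string]bool) map[string]bool {
	effective := map[string]bool{}
	for k, v := range known {
		effective[k] = v
	}
	names := make([]string, 0, len(requested))
	for k := range requested {
		names = append(names, k)
	}
	sort.Strings(names) // deterministic warning order for the demo
	for _, name := range names {
		if _, ok := known[name]; !ok {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		effective[name] = requested[name]
	}
	return effective
}

func main() {
	// A slice of the map rendered into this node's kubelet config:
	// the cluster-only names trigger warnings, the rest take effect.
	fmt.Println(applyGates(map[string]bool{
		"GatewayAPI":  true, // cluster-level gate, unknown to the kubelet
		"NewOLM":      true, // cluster-level gate, unknown to the kubelet
		"ImageVolume": true,
		"KMSv1":       true,
	}))
}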
Apr 22 21:08:39.492247 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485769 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:08:39.492752 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.485772 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:08:39.492752 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.486755 2546 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:08:39.493883 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.493861 2546 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 21:08:39.493923 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.493884 2546 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493934 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493939 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493942 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493945 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493948 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493950 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493953 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493956 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:08:39.493954 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493958 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493961 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493965 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493967 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493971 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493974 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493976 2546 feature_gate.go:328] unrecognized feature 
gate: ConsolePluginContentSecurityPolicy Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493979 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493982 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493984 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493987 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493990 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493993 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493995 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.493999 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494003 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494006 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494009 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494012 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:08:39.494175 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494015 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494018 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494021 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494024 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494029 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
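Interleaved with the unknown-gate warnings are two different messages: feature_gate.go:351 warns when a GA gate (ServiceAccountTokenNodeBinding) is still set explicitly, and feature_gate.go:349 warns when a deprecated gate (KMSv1) is enabled. Both overrides are accepted; the warning only flags that the knob is scheduled for removal. A hedged sketch of that lifecycle check, using made-up stage constants rather than the real featuregate.FeatureSpec type:

package main

import "fmt"

type stage int

const (
	alpha stage = iota
	beta
	ga
	deprecated
)

// specs stands in for the kubelet's registry entries; the stages
// below match what the warnings in the log imply for these gates.
var specs = map[string]stage{
	"ServiceAccountTokenNodeBinding": ga,
	"KMSv1":                          deprecated,
}

// warnOnExplicitSet mirrors feature_gate.go:349/:351: setting a GA or
// deprecated gate still works, but logs that the override is temporary.
func warnOnExplicitSet(name string, value bool) {
	switch specs[name] {
	case ga:
		fmt.Printf("W Setting GA feature gate %s=%t. It will be removed in a future release.\n", name, value)
	case deprecated:
		fmt.Printf("W Setting deprecated feature gate %s=%t. It will be removed in a future release.\n", name, value)
	}
}

func main() {
	warnOnExplicitSet("ServiceAccountTokenNodeBinding", true)
	warnOnExplicitSet("KMSv1", true)
}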
Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494033 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494036 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494040 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494043 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494045 2546 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494048 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494050 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494053 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494055 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494058 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494061 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494064 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494067 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494069 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:08:39.494631 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494072 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494075 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494078 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494080 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494083 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494086 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494088 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494091 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494093 2546 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494096 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494098 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494101 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494103 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494106 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494108 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494111 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494113 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494116 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494119 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494121 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:08:39.495112 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494124 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494126 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494129 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494131 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494134 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494137 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494140 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494142 2546 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494145 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494148 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494151 2546 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:08:39.495597 ip-10-0-138-20 
kubenswrapper[2546]: W0422 21:08:39.494153 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494156 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494158 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494161 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494163 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494166 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494168 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494171 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 21:08:39.495597 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494173 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.494179 2546 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494281 2546 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494286 2546 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494289 2546 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494292 2546 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494294 2546 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494297 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494300 2546 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494303 2546 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494305 2546 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494308 2546 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 
21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494311 2546 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494313 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494316 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 21:08:39.496133 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494318 2546 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494321 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494324 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494327 2546 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494330 2546 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494332 2546 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494335 2546 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494338 2546 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494342 2546 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494346 2546 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494349 2546 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494351 2546 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494354 2546 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494356 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494359 2546 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494361 2546 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494364 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494367 2546 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494369 2546 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494372 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:08:39.496500 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494374 2546 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494376 2546 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494379 2546 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494382 2546 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494384 2546 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494387 2546 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494389 2546 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494392 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494394 2546 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494397 2546 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494401 2546 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
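Each pass over the gate list ends with the same feature_gate.go:384 dump (visible twice above and once more below): an effective map holding only the recognized Kubernetes gates, every cluster-only name having been warned about and dropped. Downstream kubelet code then does plain boolean lookups against that resolved map. A minimal sketch of the query side, with values copied from the dump in this log and an illustrative enabled helper standing in for featureGate.Enabled:

package main

import "fmt"

// effective mirrors a subset of the feature_gate.go:384 dump above.
var effective = map[string]bool{
	"ImageVolume":                    true,
	"KMSv1":                          true,
	"NodeSwap":                       false,
	"ServiceAccountTokenNodeBinding": true,
	"UserNamespacesSupport":          true,
	"VolumeAttributesClass":          false,
}

// enabled does a straight map lookup; unknown names fall back to
// false at query time rather than producing another warning.
func enabled(name string) bool { return effective[name] }

func main() {
	if enabled("NodeSwap") {
		fmt.Println("swap-aware path taken")
	} else {
		fmt.Println("NodeSwap off on this node, matching the dump above")
	}
}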
Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494404 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494407 2546 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494410 2546 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494413 2546 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494416 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494418 2546 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494421 2546 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494424 2546 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494427 2546 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:08:39.496994 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494429 2546 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494432 2546 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494434 2546 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494437 2546 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494440 2546 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494442 2546 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494444 2546 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494447 2546 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494449 2546 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494452 2546 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494455 2546 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494457 2546 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494460 2546 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494462 2546 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494465 2546 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494467 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494470 2546 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494472 2546 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494475 2546 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494477 2546 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:08:39.497503 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494480 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494482 2546 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494485 2546 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494488 2546 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494490 2546 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494493 2546 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494495 2546 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494498 2546 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494500 2546 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494502 2546 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494505 2546 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494508 2546 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:39.494510 2546 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.494515 2546 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:08:39.498002 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.495443 2546 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 21:08:39.499239 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.499225 2546 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 21:08:39.500292 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.500279 2546 server.go:1019] "Starting client certificate rotation" Apr 22 21:08:39.500391 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.500376 2546 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 21:08:39.500428 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.500419 2546 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 21:08:39.527937 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.527918 2546 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:08:39.534095 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.534059 2546 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:08:39.551877 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.551855 2546 log.go:25] "Validated CRI v1 runtime API" Apr 22 21:08:39.558743 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.558720 2546 log.go:25] "Validated CRI v1 image API" Apr 22 21:08:39.560286 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.560270 2546 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 21:08:39.562893 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.562875 2546 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:08:39.565038 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.565017 2546 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a191e931-8134-4a25-8ad4-7f539d17df46:/dev/nvme0n1p4 f5750272-0b4e-4c4b-ad9a-b0326dae3d26:/dev/nvme0n1p3] Apr 22 21:08:39.565103 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.565039 2546 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 21:08:39.574009 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.573886 2546 manager.go:217] Machine: {Timestamp:2026-04-22 21:08:39.56952445 +0000 UTC m=+0.478825156 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3169828 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24dd0ee3130509f76242c94a2d55f5 SystemUUID:ec24dd0e-e313-0509-f762-42c94a2d55f5 
BootID:04e3b4b5-af92-4af3-a6b9-57881423ff42 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:aa:a5:04:74:4d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:aa:a5:04:74:4d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:8b:e6:7e:20:ba Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 21:08:39.574009 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.574006 2546 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
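Just above, cAdvisor's machine discovery (the fs.go:135-136 and manager.go:217 records) enumerates filesystem UUIDs, partitions, capacities, and inode counts before any pods run; those Capacity/Inodes figures are what statfs(2) reports per mountpoint. A small sketch of the same probe, assuming Linux and calling syscall.Statfs directly instead of going through cAdvisor's fs package:

package main

import (
	"fmt"
	"syscall"
)

// probe reports capacity and inode counts for one mountpoint, the
// same figures recorded as Capacity/Inodes in the fs.go lines above.
func probe(mountpoint string) error {
	var st syscall.Statfs_t
	if err := syscall.Statfs(mountpoint, &st); err != nil {
		return fmt.Errorf("statfs %s: %w", mountpoint, err)
	}
	capacity := st.Blocks * uint64(st.Bsize) // total size in bytes
	fmt.Printf("%s: capacity=%d bytes inodes=%d\n", mountpoint, capacity, st.Files)
	return nil
}

func main() {
	// The mountpoints enumerated in the fs.go:136 record above.
	for _, m := range []string{"/", "/boot", "/var", "/run", "/tmp", "/dev/shm"} {
		if err := probe(m); err != nil {
			fmt.Println(err)
		}
	}
}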
Apr 22 21:08:39.574117 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.574098 2546 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 21:08:39.575437 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.575395 2546 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 21:08:39.575593 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.575439 2546 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-20.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 21:08:39.575637 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.575602 2546 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 21:08:39.575637 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.575611 2546 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 21:08:39.575637 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.575625 2546 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 21:08:39.575637 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.575638 2546 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 21:08:39.576759 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.576749 2546 state_mem.go:36] "Initialized new in-memory state store" Apr 22 21:08:39.577046 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.577036 2546 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 21:08:39.579561 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.579551 2546 kubelet.go:491] "Attempting to sync node with API server" Apr 22 21:08:39.579596 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.579571 2546 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 22 21:08:39.579596 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.579585 2546 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 21:08:39.579596 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.579595 2546 kubelet.go:397] "Adding apiserver pod source" Apr 22 21:08:39.579721 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.579604 2546 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 21:08:39.580662 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.580638 2546 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 21:08:39.580706 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.580676 2546 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 21:08:39.584091 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.584075 2546 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 21:08:39.585640 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.585626 2546 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 21:08:39.587816 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587799 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 21:08:39.587859 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587830 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 21:08:39.587859 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587843 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 21:08:39.587859 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587855 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 21:08:39.587942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587866 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 21:08:39.587942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587877 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 21:08:39.587942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587888 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 21:08:39.587942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587900 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 21:08:39.587942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587912 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 21:08:39.587942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587924 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 21:08:39.588101 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587952 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 21:08:39.588101 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.587972 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 21:08:39.588997 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.588986 2546 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 21:08:39.589028 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.588999 2546 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 22 21:08:39.593008 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.592995 2546 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 21:08:39.593072 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.593040 2546 server.go:1295] "Started kubelet" Apr 22 21:08:39.593167 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.593119 2546 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 21:08:39.593241 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.593190 2546 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 21:08:39.593310 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.593270 2546 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 21:08:39.594866 ip-10-0-138-20 systemd[1]: Started Kubernetes Kubelet. Apr 22 21:08:39.596502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.596483 2546 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 21:08:39.597033 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.597013 2546 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-20.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 21:08:39.597171 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.597030 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-20.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 21:08:39.597243 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.597095 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 21:08:39.597860 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.597845 2546 server.go:317] "Adding debug handlers to kubelet server" Apr 22 21:08:39.602665 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.602628 2546 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9dcdf" Apr 22 21:08:39.603256 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.602165 2546 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-20.ec2.internal.18a8c9f7e8883962 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-20.ec2.internal,UID:ip-10-0-138-20.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-20.ec2.internal,},FirstTimestamp:2026-04-22 21:08:39.593007458 +0000 UTC m=+0.502308159,LastTimestamp:2026-04-22 21:08:39.593007458 +0000 UTC m=+0.502308159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-20.ec2.internal,}" Apr 22 
21:08:39.604453 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.604437 2546 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 21:08:39.605974 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.605958 2546 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 21:08:39.606052 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.605978 2546 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 21:08:39.606691 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.606672 2546 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 21:08:39.606691 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.606674 2546 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 21:08:39.606847 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.606702 2546 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 21:08:39.606905 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.606889 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:39.607011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.606947 2546 reconstruct.go:97] "Volume reconstruction finished" Apr 22 21:08:39.607011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.606957 2546 reconciler.go:26] "Reconciler: start to sync state" Apr 22 21:08:39.607540 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.607520 2546 factory.go:55] Registering systemd factory Apr 22 21:08:39.607626 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.607552 2546 factory.go:223] Registration of the systemd container factory successfully Apr 22 21:08:39.607772 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.607760 2546 factory.go:153] Registering CRI-O factory Apr 22 21:08:39.607772 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.607773 2546 factory.go:223] Registration of the crio container factory successfully Apr 22 21:08:39.607873 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.607824 2546 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 21:08:39.607873 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.607851 2546 factory.go:103] Registering Raw factory Apr 22 21:08:39.607873 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.607866 2546 manager.go:1196] Started watching for new ooms in manager Apr 22 21:08:39.608236 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.608223 2546 manager.go:319] Starting recovery of all containers Apr 22 21:08:39.609776 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.609758 2546 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9dcdf" Apr 22 21:08:39.617471 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.617303 2546 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:39.619391 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.619375 2546 manager.go:324] Recovery completed Apr 22 21:08:39.620023 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.619987 2546 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" 
err="nodes \"ip-10-0-138-20.ec2.internal\" not found" node="ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.623944 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.623927 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:39.626731 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.626712 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:39.626828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.626745 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:39.626828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.626760 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:39.627701 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.627686 2546 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 21:08:39.627701 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.627698 2546 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 21:08:39.627811 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.627717 2546 state_mem.go:36] "Initialized new in-memory state store" Apr 22 21:08:39.630807 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.630796 2546 policy_none.go:49] "None policy: Start" Apr 22 21:08:39.630856 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.630812 2546 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 21:08:39.630856 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.630821 2546 state_mem.go:35] "Initializing new in-memory state store" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.664972 2546 manager.go:341] "Starting Device Plugin manager" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.665017 2546 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.665030 2546 server.go:85] "Starting device plugin registration server" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.665331 2546 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.665341 2546 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.665442 2546 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.665517 2546 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.665527 2546 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.666170 2546 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 21:08:39.673011 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.666211 2546 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:39.673879 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.673854 2546 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 21:08:39.675240 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.675216 2546 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 21:08:39.675341 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.675245 2546 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 21:08:39.675341 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.675264 2546 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 21:08:39.675341 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.675272 2546 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 21:08:39.675341 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.675308 2546 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 21:08:39.677524 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.677503 2546 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:39.765768 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.765684 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:39.766874 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.766859 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:39.766947 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.766889 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:39.766947 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.766901 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:39.766947 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.766925 2546 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.776220 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.776201 2546 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal"] Apr 22 21:08:39.776272 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.776260 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:39.776559 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.776544 2546 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.776596 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.776571 2546 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-20.ec2.internal\": node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:39.777068 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.777054 2546 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:39.777134 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.777086 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:39.777134 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.777096 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:39.778778 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.778767 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:39.778926 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.778912 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.778968 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.778951 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:39.779432 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.779416 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:39.779491 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.779448 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:39.779491 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.779458 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:39.779562 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.779416 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:39.779562 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.779521 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:39.779562 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.779534 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:39.781197 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.781178 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.781290 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.781203 2546 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:39.781869 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.781855 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:39.781964 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.781880 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:39.781964 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.781890 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:39.789520 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.789504 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:39.805420 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.805399 2546 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-20.ec2.internal\" not found" node="ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.807935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.807910 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.809800 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.809784 2546 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-20.ec2.internal\" not found" node="ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.889967 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.889940 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:39.908232 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.908211 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.908329 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.908242 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.908329 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.908262 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/2219300bfb8a5cea9f09d55fabfc69ab-config\") pod \"kube-apiserver-proxy-ip-10-0-138-20.ec2.internal\" (UID: \"2219300bfb8a5cea9f09d55fabfc69ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.908399 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:39.908324 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:39.990403 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:39.990356 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:40.008728 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.008707 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2219300bfb8a5cea9f09d55fabfc69ab-config\") pod \"kube-apiserver-proxy-ip-10-0-138-20.ec2.internal\" (UID: \"2219300bfb8a5cea9f09d55fabfc69ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 22 21:08:40.008784 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.008735 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:40.008784 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.008776 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:40.008848 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.008802 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2219300bfb8a5cea9f09d55fabfc69ab-config\") pod \"kube-apiserver-proxy-ip-10-0-138-20.ec2.internal\" (UID: \"2219300bfb8a5cea9f09d55fabfc69ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 22 21:08:40.091173 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.091144 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:40.107432 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.107406 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:40.113399 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.113382 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 22 21:08:40.191750 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.191709 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:40.292228 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.292193 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:40.392804 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.392726 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:40.493245 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.493212 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:40.500379 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.500359 2546 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 21:08:40.500541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.500520 2546 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 21:08:40.500583 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.500534 2546 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 21:08:40.594367 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.594328 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 22 21:08:40.607216 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.607188 2546 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 21:08:40.613336 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.613307 2546 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 21:03:39 +0000 UTC" deadline="2028-02-06 08:45:15.196891664 +0000 UTC" Apr 22 21:08:40.613336 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.613333 2546 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15707h36m34.583561932s" Apr 22 21:08:40.617847 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.617821 2546 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:08:40.619036 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:40.619011 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2219300bfb8a5cea9f09d55fabfc69ab.slice/crio-6ac3cc8613007f9e4cec463a8d0d78d8732b697bc9623ba0edfd7380ab80c0f6 WatchSource:0}: Error finding container 6ac3cc8613007f9e4cec463a8d0d78d8732b697bc9623ba0edfd7380ab80c0f6: Status 404 returned error can't find the container with id 
Apr 22 21:08:40.619036 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:40.619011 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2219300bfb8a5cea9f09d55fabfc69ab.slice/crio-6ac3cc8613007f9e4cec463a8d0d78d8732b697bc9623ba0edfd7380ab80c0f6 WatchSource:0}: Error finding container 6ac3cc8613007f9e4cec463a8d0d78d8732b697bc9623ba0edfd7380ab80c0f6: Status 404 returned error can't find the container with id 6ac3cc8613007f9e4cec463a8d0d78d8732b697bc9623ba0edfd7380ab80c0f6
Apr 22 21:08:40.619782 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:40.619760 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b55359ad2fcf9ee78f0e3dd4c3711d.slice/crio-497c6a898b4fb346e070ca724e8a514942bc92f5367b6539657d19616d82c843 WatchSource:0}: Error finding container 497c6a898b4fb346e070ca724e8a514942bc92f5367b6539657d19616d82c843: Status 404 returned error can't find the container with id 497c6a898b4fb346e070ca724e8a514942bc92f5367b6539657d19616d82c843
Apr 22 21:08:40.623449 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.623426 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 21:08:40.652055 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.651997 2546 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qj9s4"
Apr 22 21:08:40.657551 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.657535 2546 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qj9s4"
Apr 22 21:08:40.670018 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.669991 2546 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:08:40.678703 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.678661 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" event={"ID":"87b55359ad2fcf9ee78f0e3dd4c3711d","Type":"ContainerStarted","Data":"497c6a898b4fb346e070ca724e8a514942bc92f5367b6539657d19616d82c843"}
Apr 22 21:08:40.679635 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:40.679615 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" event={"ID":"2219300bfb8a5cea9f09d55fabfc69ab","Type":"ContainerStarted","Data":"6ac3cc8613007f9e4cec463a8d0d78d8732b697bc9623ba0edfd7380ab80c0f6"}
Apr 22 21:08:40.694800 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.694769 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 22 21:08:40.795344 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.795292 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 22 21:08:40.895891 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.895850 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 22 21:08:40.996420 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:40.996340 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 22 21:08:41.097252 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.097217 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 22 21:08:41.198121 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.198086 2546 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:41.206409 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.206387 2546 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 22 21:08:41.218553 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.218428 2546 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 21:08:41.219611 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.219584 2546 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 22 21:08:41.226775 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.226752 2546 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 21:08:41.581083 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.581049 2546 apiserver.go:52] "Watching apiserver" Apr 22 21:08:41.583496 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.583470 2546 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:41.586767 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.586749 2546 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 21:08:41.589372 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.589350 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4476p","openshift-network-diagnostics/network-check-target-hm4qp","openshift-network-operator/iptables-alerter-f6hmc","openshift-ovn-kubernetes/ovnkube-node-6flqf","kube-system/konnectivity-agent-l25nv","openshift-cluster-node-tuning-operator/tuned-dgcb8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal","openshift-multus/multus-additional-cni-plugins-v5b9n","kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd","openshift-image-registry/node-ca-zbnft","openshift-multus/multus-tdtws"] Apr 22 21:08:41.592025 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.592003 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.594013 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.593991 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:08:41.594121 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.594062 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 21:08:41.594189 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.594126 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 21:08:41.594189 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.594150 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dj76g\"" Apr 22 21:08:41.594699 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.594545 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.596906 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.596725 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 21:08:41.596906 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.596749 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:08:41.596906 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.596798 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 21:08:41.596906 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.596878 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 21:08:41.597131 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.597056 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 21:08:41.597131 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.597107 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 21:08:41.597131 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.597127 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 21:08:41.597366 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.597349 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pm9kf\"" Apr 22 21:08:41.598529 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.598494 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 21:08:41.599098 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.599081 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-kzp5w\"" Apr 22 21:08:41.599399 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.599383 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 21:08:41.601346 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.601197 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.601346 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.601327 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.603242 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.603226 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 21:08:41.603370 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.603333 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pzc7r\"" Apr 22 21:08:41.603491 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.603470 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 21:08:41.603986 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.603968 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:08:41.604153 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.604141 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 21:08:41.604411 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.604399 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 21:08:41.604724 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.604711 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 21:08:41.604914 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.604896 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m42kf\"" Apr 22 21:08:41.605331 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.605312 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 21:08:41.607667 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.607470 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.607667 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.607594 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:41.608029 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.607984 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:08:41.610112 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.610003 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 21:08:41.610241 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.610222 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tkx2v\"" Apr 22 21:08:41.610618 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.610597 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:08:41.610725 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.610685 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:08:41.613138 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.613118 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.615178 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.615157 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 21:08:41.615267 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.615237 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 21:08:41.615374 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.615345 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 21:08:41.615450 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.615433 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q7rq8\"" Apr 22 21:08:41.615572 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.615544 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.617670 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617627 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 21:08:41.617775 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617753 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fndn8\"" Apr 22 21:08:41.617879 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617859 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-etc-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.617942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617895 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-host\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.617942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617922 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-daemon-config\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618045 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617950 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.618045 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617978 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 21:08:41.618045 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.617980 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 21:08:41.618045 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618014 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618047 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysctl-d\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.618228 
ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618062 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-cni-binary-copy\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618105 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-host-slash\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.618228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618119 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:41.618228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618134 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-png8z\" (UniqueName: \"kubernetes.io/projected/046a767f-6fe0-4c57-ba82-6a2ce75f7864-kube-api-access-png8z\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:41.618228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618171 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-env-overrides\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618191 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-var-lib-kubelet\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.618228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618213 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-kubelet\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618249 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-hostroot\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618272 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-log-socket\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618322 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea7b080b-6583-467a-8398-4eb854c288f4-etc-tuned\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618343 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dglb4\" (UniqueName: \"kubernetes.io/projected/ea7b080b-6583-467a-8398-4eb854c288f4-kube-api-access-dglb4\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618366 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-cni-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618401 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rj59\" (UniqueName: \"kubernetes.io/projected/195f14c0-2e98-4e09-8198-0f6696d61fc6-kube-api-access-4rj59\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618453 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysctl-conf\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618476 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovnkube-config\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618491 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-modprobe-d\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.618505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618504 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-cnibin\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 
21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618526 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-os-release\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618546 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h27xx\" (UniqueName: \"kubernetes.io/projected/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-kube-api-access-h27xx\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618583 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-os-release\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618669 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-slash\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618702 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovnkube-script-lib\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618731 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-cni-bin\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618756 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-cni-multus\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618781 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cnibin\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618804 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-kubelet\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618827 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-node-log\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618850 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovn-node-metrics-cert\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618873 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a782077c-00c8-4baa-8f9c-ce021c1c5d9c-konnectivity-ca\") pod \"konnectivity-agent-l25nv\" (UID: \"a782077c-00c8-4baa-8f9c-ce021c1c5d9c\") " pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:08:41.618894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618896 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-systemd\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618919 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-multus-certs\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618968 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.618990 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-ovn\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619019 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-sys\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 
21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619043 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619068 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619092 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-run-ovn-kubernetes\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619114 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-run\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619136 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-lib-modules\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619160 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-conf-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619183 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-run-netns\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619218 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysconfig\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619246 2546 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-socket-dir-parent\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619268 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-netns\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619304 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzl7r\" (UniqueName: \"kubernetes.io/projected/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-kube-api-access-jzl7r\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.619362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619337 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knlhc\" (UniqueName: \"kubernetes.io/projected/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-kube-api-access-knlhc\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619383 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-systemd\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619407 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-cni-netd\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619447 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cni-binary-copy\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619479 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-systemd-units\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619505 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-kubernetes\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619524 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-k8s-cni-cncf-io\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619546 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-system-cni-dir\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619561 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-var-lib-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619578 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-cni-bin\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619637 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea7b080b-6583-467a-8398-4eb854c288f4-tmp\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619703 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-system-cni-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619722 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-iptables-alerter-script\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619746 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a782077c-00c8-4baa-8f9c-ce021c1c5d9c-agent-certs\") pod \"konnectivity-agent-l25nv\" (UID: \"a782077c-00c8-4baa-8f9c-ce021c1c5d9c\") " 
pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:08:41.619953 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.619768 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-etc-kubernetes\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.658505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.658469 2546 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:03:40 +0000 UTC" deadline="2027-12-02 19:52:31.336116126 +0000 UTC" Apr 22 21:08:41.658505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.658496 2546 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14134h43m49.677622572s" Apr 22 21:08:41.683309 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.683288 2546 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:41.707636 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.707610 2546 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 21:08:41.720237 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720198 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a782077c-00c8-4baa-8f9c-ce021c1c5d9c-konnectivity-ca\") pod \"konnectivity-agent-l25nv\" (UID: \"a782077c-00c8-4baa-8f9c-ce021c1c5d9c\") " pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:08:41.720371 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720247 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-systemd\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.720371 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720265 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-multus-certs\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.720371 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720285 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-etc-selinux\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.720371 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720308 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720363 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-multus-certs\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720388 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-systemd\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720417 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-ovn\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720425 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720449 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-sys\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720477 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720481 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-sys\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720451 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-ovn\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720504 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720534 2546 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-sys-fs\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720559 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-run-ovn-kubernetes\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.720580 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720582 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-run\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720605 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-lib-modules\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720629 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt28q\" (UniqueName: \"kubernetes.io/projected/3ae026b3-22f4-47d0-a136-0a0697546e33-kube-api-access-mt28q\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720684 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-conf-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720706 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-run-netns\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720721 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysconfig\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720737 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-socket-dir-parent\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " 
pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720761 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-netns\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720786 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzl7r\" (UniqueName: \"kubernetes.io/projected/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-kube-api-access-jzl7r\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720811 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knlhc\" (UniqueName: \"kubernetes.io/projected/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-kube-api-access-knlhc\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720836 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-systemd\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720854 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-cni-netd\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720880 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cni-binary-copy\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720908 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-device-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720949 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee933643-0139-4bfd-9d0d-903d3404df30-serviceca\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720976 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-systemd-units\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721010 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-kubernetes\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.721125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721036 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-k8s-cni-cncf-io\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721054 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-system-cni-dir\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721077 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzdmh\" (UniqueName: \"kubernetes.io/projected/ee933643-0139-4bfd-9d0d-903d3404df30-kube-api-access-nzdmh\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721094 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-var-lib-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721108 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-cni-bin\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721124 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea7b080b-6583-467a-8398-4eb854c288f4-tmp\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721143 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 
21:08:41.721149 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-system-cni-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721210 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-system-cni-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721213 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-iptables-alerter-script\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721249 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a782077c-00c8-4baa-8f9c-ce021c1c5d9c-agent-certs\") pod \"konnectivity-agent-l25nv\" (UID: \"a782077c-00c8-4baa-8f9c-ce021c1c5d9c\") " pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721295 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-etc-kubernetes\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721296 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721323 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee933643-0139-4bfd-9d0d-903d3404df30-host\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721377 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysconfig\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.720878 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a782077c-00c8-4baa-8f9c-ce021c1c5d9c-konnectivity-ca\") pod \"konnectivity-agent-l25nv\" (UID: \"a782077c-00c8-4baa-8f9c-ce021c1c5d9c\") " pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:08:41.721887 
ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721486 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-run-ovn-kubernetes\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.721887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721611 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-lib-modules\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721684 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-conf-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721706 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-iptables-alerter-script\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721719 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-etc-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721746 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-host\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721753 2546 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721811 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-k8s-cni-cncf-io\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721817 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-socket-dir-parent\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721775 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-kubernetes\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721861 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-run-netns\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721883 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-system-cni-dir\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721920 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-var-lib-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721953 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-cni-bin\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721985 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-run-netns\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722074 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-etc-openvswitch\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722133 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-host\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722175 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-etc-kubernetes\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722249 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-run\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.722616 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722275 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cni-binary-copy\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722324 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-cni-netd\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722359 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-systemd-units\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722466 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-run-systemd\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.721774 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-daemon-config\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722663 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722699 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722728 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysctl-d\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722755 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-cni-binary-copy\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722779 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-host-slash\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722806 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722833 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-png8z\" (UniqueName: \"kubernetes.io/projected/046a767f-6fe0-4c57-ba82-6a2ce75f7864-kube-api-access-png8z\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722835 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722861 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-socket-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722887 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-daemon-config\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722891 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-env-overrides\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722958 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-host-slash\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.723499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.722956 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-var-lib-kubelet\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723003 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-kubelet\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723016 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-var-lib-kubelet\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723028 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-hostroot\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723060 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-log-socket\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723064 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-hostroot\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " 
pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723084 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea7b080b-6583-467a-8398-4eb854c288f4-etc-tuned\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723109 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dglb4\" (UniqueName: \"kubernetes.io/projected/ea7b080b-6583-467a-8398-4eb854c288f4-kube-api-access-dglb4\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723133 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-cni-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.723158 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.723254 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs podName:046a767f-6fe0-4c57-ba82-6a2ce75f7864 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:42.223207989 +0000 UTC m=+3.132508680 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs") pod "network-metrics-daemon-4476p" (UID: "046a767f-6fe0-4c57-ba82-6a2ce75f7864") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723556 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-kubelet\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723158 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rj59\" (UniqueName: \"kubernetes.io/projected/195f14c0-2e98-4e09-8198-0f6696d61fc6-kube-api-access-4rj59\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723601 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysctl-conf\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723676 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723706 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovnkube-config\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723732 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-modprobe-d\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.724308 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723781 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-cnibin\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723808 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723837 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-os-release\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723862 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h27xx\" (UniqueName: \"kubernetes.io/projected/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-kube-api-access-h27xx\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723886 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-os-release\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723912 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-registration-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723939 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-slash\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723958 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-cni-binary-copy\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723965 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovnkube-script-lib\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.723991 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-cni-bin\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724015 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-cni-multus\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724021 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724043 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cnibin\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724085 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-cnibin\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724192 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysctl-d\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724226 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-kubelet\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724401 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-env-overrides\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.725073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724383 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-sysctl-conf\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725210 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-os-release\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725330 2546 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea7b080b-6583-467a-8398-4eb854c288f4-etc-modprobe-d\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725406 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-cnibin\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725478 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-os-release\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725752 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-cni-multus\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725804 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-host-var-lib-cni-bin\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725860 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-slash\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.725870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.724085 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-host-kubelet\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725900 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-node-log\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725927 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovn-node-metrics-cert\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725939 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovnkube-script-lib\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.725985 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a782077c-00c8-4baa-8f9c-ce021c1c5d9c-agent-certs\") pod \"konnectivity-agent-l25nv\" (UID: \"a782077c-00c8-4baa-8f9c-ce021c1c5d9c\") " pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.726014 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-multus-cni-dir\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.726047 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-node-log\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.726055 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/195f14c0-2e98-4e09-8198-0f6696d61fc6-log-socket\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.726262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.726183 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea7b080b-6583-467a-8398-4eb854c288f4-tmp\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.726578 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.726347 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovnkube-config\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.727206 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.727185 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea7b080b-6583-467a-8398-4eb854c288f4-etc-tuned\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.728298 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.728249 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/195f14c0-2e98-4e09-8198-0f6696d61fc6-ovn-node-metrics-cert\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.731439 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.731415 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knlhc\" (UniqueName: 
\"kubernetes.io/projected/0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63-kube-api-access-knlhc\") pod \"multus-additional-cni-plugins-v5b9n\" (UID: \"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63\") " pod="openshift-multus/multus-additional-cni-plugins-v5b9n" Apr 22 21:08:41.732660 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.732615 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzl7r\" (UniqueName: \"kubernetes.io/projected/e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d-kube-api-access-jzl7r\") pod \"iptables-alerter-f6hmc\" (UID: \"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d\") " pod="openshift-network-operator/iptables-alerter-f6hmc" Apr 22 21:08:41.732799 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.732777 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rj59\" (UniqueName: \"kubernetes.io/projected/195f14c0-2e98-4e09-8198-0f6696d61fc6-kube-api-access-4rj59\") pod \"ovnkube-node-6flqf\" (UID: \"195f14c0-2e98-4e09-8198-0f6696d61fc6\") " pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:08:41.733173 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.733149 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h27xx\" (UniqueName: \"kubernetes.io/projected/aab52e3e-8c7f-4bd0-92a2-9f6136daf78d-kube-api-access-h27xx\") pod \"multus-tdtws\" (UID: \"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d\") " pod="openshift-multus/multus-tdtws" Apr 22 21:08:41.733587 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.733571 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-png8z\" (UniqueName: \"kubernetes.io/projected/046a767f-6fe0-4c57-ba82-6a2ce75f7864-kube-api-access-png8z\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:41.733807 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.733788 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dglb4\" (UniqueName: \"kubernetes.io/projected/ea7b080b-6583-467a-8398-4eb854c288f4-kube-api-access-dglb4\") pod \"tuned-dgcb8\" (UID: \"ea7b080b-6583-467a-8398-4eb854c288f4\") " pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" Apr 22 21:08:41.826362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826330 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826368 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826388 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-registration-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " 
Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826388 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-registration-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826407 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-etc-selinux\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826429 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-sys-fs\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826496 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-sys-fs\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826496 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-registration-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826498 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt28q\" (UniqueName: \"kubernetes.io/projected/3ae026b3-22f4-47d0-a136-0a0697546e33-kube-api-access-mt28q\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826500 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826547 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-device-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.826925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826565 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee933643-0139-4bfd-9d0d-903d3404df30-serviceca\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft"
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-etc-selinux\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.826925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826603 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-device-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.826925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826666 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzdmh\" (UniqueName: \"kubernetes.io/projected/ee933643-0139-4bfd-9d0d-903d3404df30-kube-api-access-nzdmh\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.826925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826707 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee933643-0139-4bfd-9d0d-903d3404df30-host\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.826925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826769 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee933643-0139-4bfd-9d0d-903d3404df30-host\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.826925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826866 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-socket-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.827236 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.826990 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ae026b3-22f4-47d0-a136-0a0697546e33-socket-dir\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" Apr 22 21:08:41.827236 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.827035 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee933643-0139-4bfd-9d0d-903d3404df30-serviceca\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft" Apr 22 21:08:41.832346 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.832231 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:08:41.832346 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.832258 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 
Apr 22 21:08:41.832346 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.832271 2546 projected.go:194] Error preparing data for projected volume kube-api-access-x98kt for pod openshift-network-diagnostics/network-check-target-hm4qp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:41.832542 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:41.832374 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt podName:dc5617e8-2aaf-4671-9fab-db6988f9b8b4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:42.332355397 +0000 UTC m=+3.241656086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x98kt" (UniqueName: "kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt") pod "network-check-target-hm4qp" (UID: "dc5617e8-2aaf-4671-9fab-db6988f9b8b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:41.834961 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.834937 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt28q\" (UniqueName: \"kubernetes.io/projected/3ae026b3-22f4-47d0-a136-0a0697546e33-kube-api-access-mt28q\") pod \"aws-ebs-csi-driver-node-2lljd\" (UID: \"3ae026b3-22f4-47d0-a136-0a0697546e33\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.835787 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.835768 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzdmh\" (UniqueName: \"kubernetes.io/projected/ee933643-0139-4bfd-9d0d-903d3404df30-kube-api-access-nzdmh\") pod \"node-ca-zbnft\" (UID: \"ee933643-0139-4bfd-9d0d-903d3404df30\") " pod="openshift-image-registry/node-ca-zbnft"
Apr 22 21:08:41.905331 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.905296 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f6hmc"
Apr 22 21:08:41.913150 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.913130 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf"
Apr 22 21:08:41.924756 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.924731 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l25nv"
Apr 22 21:08:41.930459 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.930440 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dgcb8"
Apr 22 21:08:41.936994 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.936975 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v5b9n"
Apr 22 21:08:41.943516 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.943496 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tdtws"
Apr 22 21:08:41.950094 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.950078 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd"
Apr 22 21:08:41.955740 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:41.955717 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zbnft"
Apr 22 21:08:42.229896 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.229812 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:42.230052 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.230000 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:42.230111 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.230082 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs podName:046a767f-6fe0-4c57-ba82-6a2ce75f7864 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:43.23005516 +0000 UTC m=+4.139355854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs") pod "network-metrics-daemon-4476p" (UID: "046a767f-6fe0-4c57-ba82-6a2ce75f7864") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:42.327366 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:42.327334 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda782077c_00c8_4baa_8f9c_ce021c1c5d9c.slice/crio-ab9e7996e13e52dd79c5eebef2015a1eb4bee1085b0e45ed72c8010dfc583e44 WatchSource:0}: Error finding container ab9e7996e13e52dd79c5eebef2015a1eb4bee1085b0e45ed72c8010dfc583e44: Status 404 returned error can't find the container with id ab9e7996e13e52dd79c5eebef2015a1eb4bee1085b0e45ed72c8010dfc583e44
Apr 22 21:08:42.328080 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:42.328054 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee933643_0139_4bfd_9d0d_903d3404df30.slice/crio-d3b7856b2f1e54c944ba11dcd585b94cf0a7959578e03666720c84d4d271214f WatchSource:0}: Error finding container d3b7856b2f1e54c944ba11dcd585b94cf0a7959578e03666720c84d4d271214f: Status 404 returned error can't find the container with id d3b7856b2f1e54c944ba11dcd585b94cf0a7959578e03666720c84d4d271214f
Apr 22 21:08:42.331708 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:42.331683 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab52e3e_8c7f_4bd0_92a2_9f6136daf78d.slice/crio-fe1bb3f40e710468efb037ba5d1f4bb4ee1320c240221321ff707e5d185f0b2c WatchSource:0}: Error finding container fe1bb3f40e710468efb037ba5d1f4bb4ee1320c240221321ff707e5d185f0b2c: Status 404 returned error can't find the container with id fe1bb3f40e710468efb037ba5d1f4bb4ee1320c240221321ff707e5d185f0b2c
Apr 22 21:08:42.339408 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:08:42.338402 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod195f14c0_2e98_4e09_8198_0f6696d61fc6.slice/crio-bf1b6c3531318c625723bc49ced463914bc68485ed3352c28f40584e36f9f9a2 WatchSource:0}: Error finding container bf1b6c3531318c625723bc49ced463914bc68485ed3352c28f40584e36f9f9a2: Status 404 returned error can't find the container with id bf1b6c3531318c625723bc49ced463914bc68485ed3352c28f40584e36f9f9a2
Apr 22 21:08:42.432571 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.432407 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:42.432700 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.432540 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:08:42.432700 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.432628 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:08:42.432700 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.432661 2546 projected.go:194] Error preparing data for projected volume kube-api-access-x98kt for pod openshift-network-diagnostics/network-check-target-hm4qp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:42.432798 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.432721 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt podName:dc5617e8-2aaf-4671-9fab-db6988f9b8b4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:43.432703127 +0000 UTC m=+4.342003816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x98kt" (UniqueName: "kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt") pod "network-check-target-hm4qp" (UID: "dc5617e8-2aaf-4671-9fab-db6988f9b8b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:42.658680 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.658619 2546 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:03:40 +0000 UTC" deadline="2028-01-26 15:53:31.63572463 +0000 UTC"
Apr 22 21:08:42.658680 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.658671 2546 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15450h44m48.977057495s"
Apr 22 21:08:42.670462 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.670433 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-flkgv"]
Apr 22 21:08:42.673321 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.673298 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.673446 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.673365 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde"
Apr 22 21:08:42.683177 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.683143 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tdtws" event={"ID":"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d","Type":"ContainerStarted","Data":"fe1bb3f40e710468efb037ba5d1f4bb4ee1320c240221321ff707e5d185f0b2c"}
Apr 22 21:08:42.684358 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.684335 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zbnft" event={"ID":"ee933643-0139-4bfd-9d0d-903d3404df30","Type":"ContainerStarted","Data":"d3b7856b2f1e54c944ba11dcd585b94cf0a7959578e03666720c84d4d271214f"}
Apr 22 21:08:42.687084 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.687032 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l25nv" event={"ID":"a782077c-00c8-4baa-8f9c-ce021c1c5d9c","Type":"ContainerStarted","Data":"ab9e7996e13e52dd79c5eebef2015a1eb4bee1085b0e45ed72c8010dfc583e44"}
Apr 22 21:08:42.689211 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.689174 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" event={"ID":"ea7b080b-6583-467a-8398-4eb854c288f4","Type":"ContainerStarted","Data":"b91ed345cb2be36df36d143236bd6392601c3bc62381bbd2f7c0f993b5370ec0"}
Apr 22 21:08:42.691019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.690994 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" event={"ID":"2219300bfb8a5cea9f09d55fabfc69ab","Type":"ContainerStarted","Data":"86971e72fac6a10baa337ad92df654a45b8eadc49f261733bb2bdaf7a4aa90d2"}
Apr 22 21:08:42.693910 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.693878 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"bf1b6c3531318c625723bc49ced463914bc68485ed3352c28f40584e36f9f9a2"}
Apr 22 21:08:42.695467 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.695445 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f6hmc" event={"ID":"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d","Type":"ContainerStarted","Data":"e73a4bca37295cc70f3fc628c4049c20b0ae57cd3cc3cdec3d35e64d9a47dd6e"}
Apr 22 21:08:42.697915 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.697873 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerStarted","Data":"a9ade195b3d4e328f6a26db9634bf3251c32236486276d4e7283ec041b254059"}
Apr 22 21:08:42.700106 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.700081 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" event={"ID":"3ae026b3-22f4-47d0-a136-0a0697546e33","Type":"ContainerStarted","Data":"91f67881ead9c65ea0a9491b39c2f665fab90f934a5cdebd30a2a7ce3eec4549"}
Apr 22 21:08:42.709249 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.709200 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" podStartSLOduration=1.709185555 podStartE2EDuration="1.709185555s" podCreationTimestamp="2026-04-22 21:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:08:42.708796934 +0000 UTC m=+3.618097646" watchObservedRunningTime="2026-04-22 21:08:42.709185555 +0000 UTC m=+3.618486266"
Apr 22 21:08:42.734878 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.734374 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-kubelet-config\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.734878 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.734442 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.734878 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.734499 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-dbus\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.835448 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.835385 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-dbus\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.835448 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.835450 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-kubelet-config\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.835685 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.835494 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.835685 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.835636 2546 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:42.835789 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:42.835715 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret podName:e49b46d0-3a94-438f-bdc2-b93ad7f84cde nodeName:}" failed. No retries permitted until 2026-04-22 21:08:43.33569623 +0000 UTC m=+4.244996927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret") pod "global-pull-secret-syncer-flkgv" (UID: "e49b46d0-3a94-438f-bdc2-b93ad7f84cde") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:42.836075 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.836055 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-dbus\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:42.836156 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:42.836124 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-kubelet-config\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:43.239805 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:43.239773 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:43.239970 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.239902 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:43.239970 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.239958 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs podName:046a767f-6fe0-4c57-ba82-6a2ce75f7864 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:45.239941286 +0000 UTC m=+6.149241988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs") pod "network-metrics-daemon-4476p" (UID: "046a767f-6fe0-4c57-ba82-6a2ce75f7864") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:43.340788 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:43.340755 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:43.340935 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.340914 2546 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:43.340997 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.340976 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret podName:e49b46d0-3a94-438f-bdc2-b93ad7f84cde nodeName:}" failed. No retries permitted until 2026-04-22 21:08:44.340958038 +0000 UTC m=+5.250258739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret") pod "global-pull-secret-syncer-flkgv" (UID: "e49b46d0-3a94-438f-bdc2-b93ad7f84cde") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:43.441379 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:43.441344 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:43.441566 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.441549 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:08:43.441627 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.441576 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:08:43.441627 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.441589 2546 projected.go:194] Error preparing data for projected volume kube-api-access-x98kt for pod openshift-network-diagnostics/network-check-target-hm4qp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:43.441771 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.441663 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt podName:dc5617e8-2aaf-4671-9fab-db6988f9b8b4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:45.441626376 +0000 UTC m=+6.350927072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x98kt" (UniqueName: "kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt") pod "network-check-target-hm4qp" (UID: "dc5617e8-2aaf-4671-9fab-db6988f9b8b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:43.676778 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:43.676044 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:43.676778 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.676163 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4"
Apr 22 21:08:43.676778 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:43.676617 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:43.676778 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:43.676739 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864"
Apr 22 21:08:43.723693 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:43.723379 2546 generic.go:358] "Generic (PLEG): container finished" podID="87b55359ad2fcf9ee78f0e3dd4c3711d" containerID="7a226d3a314fc11af2927011a4221e75cbce389d44090ce507964d5f21281404" exitCode=0
Apr 22 21:08:43.724367 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:43.724296 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" event={"ID":"87b55359ad2fcf9ee78f0e3dd4c3711d","Type":"ContainerDied","Data":"7a226d3a314fc11af2927011a4221e75cbce389d44090ce507964d5f21281404"}
Apr 22 21:08:44.351092 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:44.351052 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:44.351274 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:44.351247 2546 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:44.351348 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:44.351317 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret podName:e49b46d0-3a94-438f-bdc2-b93ad7f84cde nodeName:}" failed. No retries permitted until 2026-04-22 21:08:46.351297792 +0000 UTC m=+7.260598504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret") pod "global-pull-secret-syncer-flkgv" (UID: "e49b46d0-3a94-438f-bdc2-b93ad7f84cde") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:44.676323 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:44.675729 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:44.676323 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:44.675884 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde"
Apr 22 21:08:44.729378 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:44.729341 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" event={"ID":"87b55359ad2fcf9ee78f0e3dd4c3711d","Type":"ContainerStarted","Data":"0dabbb8cadb1afc979336eb9328f9645ebfd27b589b550149900fd7d73f2ee36"}
Apr 22 21:08:44.741617 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:44.741162 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" podStartSLOduration=3.74114349 podStartE2EDuration="3.74114349s" podCreationTimestamp="2026-04-22 21:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:08:44.740808515 +0000 UTC m=+5.650109229" watchObservedRunningTime="2026-04-22 21:08:44.74114349 +0000 UTC m=+5.650444201"
Apr 22 21:08:45.258976 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:45.258934 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:45.259144 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.259115 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:45.259205 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.259184 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs podName:046a767f-6fe0-4c57-ba82-6a2ce75f7864 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:49.259165212 +0000 UTC m=+10.168465914 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs") pod "network-metrics-daemon-4476p" (UID: "046a767f-6fe0-4c57-ba82-6a2ce75f7864") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:45.460290 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:45.460242 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:45.460461 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.460370 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:08:45.460461 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.460385 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:08:45.460461 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.460394 2546 projected.go:194] Error preparing data for projected volume kube-api-access-x98kt for pod openshift-network-diagnostics/network-check-target-hm4qp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:45.460461 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.460439 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt podName:dc5617e8-2aaf-4671-9fab-db6988f9b8b4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:49.460422639 +0000 UTC m=+10.369723328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x98kt" (UniqueName: "kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt") pod "network-check-target-hm4qp" (UID: "dc5617e8-2aaf-4671-9fab-db6988f9b8b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:45.678547 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:45.678514 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:45.678730 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.678660 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4"
Apr 22 21:08:45.679127 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:45.679106 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:45.679269 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:45.679248 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864"
Apr 22 21:08:46.368237 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:46.368201 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:46.368693 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:46.368340 2546 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:46.368693 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:46.368402 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret podName:e49b46d0-3a94-438f-bdc2-b93ad7f84cde nodeName:}" failed. No retries permitted until 2026-04-22 21:08:50.368382442 +0000 UTC m=+11.277683144 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret") pod "global-pull-secret-syncer-flkgv" (UID: "e49b46d0-3a94-438f-bdc2-b93ad7f84cde") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:46.676286 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:46.676206 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:46.676447 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:46.676342 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde"
Apr 22 21:08:47.675829 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:47.675788 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:47.675829 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:47.675819 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:47.676302 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:47.675918 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4"
Apr 22 21:08:47.676302 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:47.676034 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864"
Apr 22 21:08:48.676284 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:48.676241 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:48.676719 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:48.676383 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde"
Apr 22 21:08:49.294587 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:49.294511 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:49.294773 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.294698 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:49.294842 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.294782 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs podName:046a767f-6fe0-4c57-ba82-6a2ce75f7864 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:57.294760302 +0000 UTC m=+18.204061001 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs") pod "network-metrics-daemon-4476p" (UID: "046a767f-6fe0-4c57-ba82-6a2ce75f7864") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:08:49.496152 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:49.496108 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:49.496324 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.496266 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:08:49.496324 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.496285 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:08:49.496324 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.496297 2546 projected.go:194] Error preparing data for projected volume kube-api-access-x98kt for pod openshift-network-diagnostics/network-check-target-hm4qp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:49.496473 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.496354 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt podName:dc5617e8-2aaf-4671-9fab-db6988f9b8b4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:57.496335055 +0000 UTC m=+18.405635758 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x98kt" (UniqueName: "kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt") pod "network-check-target-hm4qp" (UID: "dc5617e8-2aaf-4671-9fab-db6988f9b8b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:08:49.679043 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:49.677273 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:49.679043 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.677408 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4"
Apr 22 21:08:49.679043 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:49.677852 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:49.679043 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:49.677995 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864"
Apr 22 21:08:50.402582 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:50.402546 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:50.402779 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:50.402761 2546 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:50.402834 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:50.402826 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret podName:e49b46d0-3a94-438f-bdc2-b93ad7f84cde nodeName:}" failed. No retries permitted until 2026-04-22 21:08:58.402807593 +0000 UTC m=+19.312108288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret") pod "global-pull-secret-syncer-flkgv" (UID: "e49b46d0-3a94-438f-bdc2-b93ad7f84cde") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:08:50.676397 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:50.676295 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:50.676528 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:50.676457 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde"
Apr 22 21:08:51.676335 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:51.676301 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:51.676790 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:51.676342 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:51.676790 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:51.676429 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864"
Apr 22 21:08:51.676790 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:51.676557 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4"
Apr 22 21:08:52.676045 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:52.676015 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:52.676244 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:52.676115 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde"
Apr 22 21:08:53.676214 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:53.676173 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:08:53.676608 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:53.676230 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:08:53.676608 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:53.676333 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4"
Apr 22 21:08:53.676608 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:53.676430 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864"
Apr 22 21:08:54.675626 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:54.675594 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:08:54.675799 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:54.675716 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde"
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:55.676872 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:55.676434 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:08:55.676872 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:55.676535 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:08:55.676872 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:55.676633 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:08:56.676203 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:56.676169 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:08:56.676402 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:56.676302 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:08:57.353608 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:57.353576 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:57.354048 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.353751 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:57.354048 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.353811 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs podName:046a767f-6fe0-4c57-ba82-6a2ce75f7864 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.353794677 +0000 UTC m=+34.263095370 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs") pod "network-metrics-daemon-4476p" (UID: "046a767f-6fe0-4c57-ba82-6a2ce75f7864") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:57.555719 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:57.555682 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:08:57.555955 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.555873 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:08:57.555955 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.555897 2546 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:08:57.555955 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.555908 2546 projected.go:194] Error preparing data for projected volume kube-api-access-x98kt for pod openshift-network-diagnostics/network-check-target-hm4qp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:57.556110 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.555967 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt podName:dc5617e8-2aaf-4671-9fab-db6988f9b8b4 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.555945931 +0000 UTC m=+34.465246639 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x98kt" (UniqueName: "kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt") pod "network-check-target-hm4qp" (UID: "dc5617e8-2aaf-4671-9fab-db6988f9b8b4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:57.676471 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:57.676390 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:08:57.676634 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.676500 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:08:57.676634 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:57.676538 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:57.676634 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:57.676620 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:08:58.461797 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:58.461740 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:08:58.462179 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:58.461877 2546 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:08:58.462179 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:58.461961 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret podName:e49b46d0-3a94-438f-bdc2-b93ad7f84cde nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.461942033 +0000 UTC m=+35.371242734 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret") pod "global-pull-secret-syncer-flkgv" (UID: "e49b46d0-3a94-438f-bdc2-b93ad7f84cde") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:08:58.676430 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:58.676394 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:08:58.676603 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:58.676501 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:08:59.676891 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.676555 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:08:59.677772 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.676624 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:08:59.677772 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:59.676949 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:08:59.677772 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:08:59.677044 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:08:59.756748 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.756728 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:08:59.757040 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.757018 2546 generic.go:358] "Generic (PLEG): container finished" podID="195f14c0-2e98-4e09-8198-0f6696d61fc6" containerID="5d09e90db7c0671f82d3b57d5b3b831c32e755d0dc640d5c7c200b9a6e816e3a" exitCode=1 Apr 22 21:08:59.757130 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.757091 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"89c116d59e412c72f8ebc75c487bd1340e60ca54858f375439cb7779099d40d3"} Apr 22 21:08:59.757130 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.757122 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"e681a6768e318354ad044bb777daf30a4711caa459bbf3ec02d4ba3bb593e407"} Apr 22 21:08:59.757193 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.757131 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerDied","Data":"5d09e90db7c0671f82d3b57d5b3b831c32e755d0dc640d5c7c200b9a6e816e3a"} Apr 22 21:08:59.757193 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.757142 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"d494abd31bdce3de5fdbb48c35e05e785e5b3ca77f50592f055f9d62239bfb02"} Apr 22 21:08:59.758406 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.758384 2546 generic.go:358] "Generic (PLEG): container finished" podID="0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63" containerID="04c9f4f1afec7630082927a78caa01bf901380b28aae89bab5e1f799d9a29daf" exitCode=0 Apr 22 21:08:59.758510 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.758471 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerDied","Data":"04c9f4f1afec7630082927a78caa01bf901380b28aae89bab5e1f799d9a29daf"} Apr 22 21:08:59.759997 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.759758 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" event={"ID":"3ae026b3-22f4-47d0-a136-0a0697546e33","Type":"ContainerStarted","Data":"8464c531f4542bbb432508df3f6f493e3cd595d1fcaeb080022020d44b6468d3"} Apr 22 21:08:59.761329 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.761306 2546 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-tdtws" event={"ID":"aab52e3e-8c7f-4bd0-92a2-9f6136daf78d","Type":"ContainerStarted","Data":"364cac53bf254c5065c7c292282e252457920c929302871b8b705fc1472a0dce"} Apr 22 21:08:59.763001 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.762984 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zbnft" event={"ID":"ee933643-0139-4bfd-9d0d-903d3404df30","Type":"ContainerStarted","Data":"f729fea680b2790d8d31216dbb64eeb19fc19c45d05fccdf5f782817c246f516"} Apr 22 21:08:59.765409 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.765387 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l25nv" event={"ID":"a782077c-00c8-4baa-8f9c-ce021c1c5d9c","Type":"ContainerStarted","Data":"e261f1a5dda59e2fd53c4833c89afb9739798748d4c5a8731a8936e29af232e5"} Apr 22 21:08:59.766635 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.766609 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" event={"ID":"ea7b080b-6583-467a-8398-4eb854c288f4","Type":"ContainerStarted","Data":"47f8ad6717270c42e3ac189f18becb1fd3d9ac25ffc1998a8bdd31a5908c875d"} Apr 22 21:08:59.782835 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.782805 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zbnft" podStartSLOduration=2.979879401 podStartE2EDuration="19.782794262s" podCreationTimestamp="2026-04-22 21:08:40 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.329521047 +0000 UTC m=+3.238821736" lastFinishedPulling="2026-04-22 21:08:59.132435905 +0000 UTC m=+20.041736597" observedRunningTime="2026-04-22 21:08:59.782714477 +0000 UTC m=+20.692015188" watchObservedRunningTime="2026-04-22 21:08:59.782794262 +0000 UTC m=+20.692094973" Apr 22 21:08:59.794416 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.794377 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tdtws" podStartSLOduration=3.965041809 podStartE2EDuration="20.794362443s" podCreationTimestamp="2026-04-22 21:08:39 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.340171588 +0000 UTC m=+3.249472291" lastFinishedPulling="2026-04-22 21:08:59.169492234 +0000 UTC m=+20.078792925" observedRunningTime="2026-04-22 21:08:59.794340521 +0000 UTC m=+20.703641232" watchObservedRunningTime="2026-04-22 21:08:59.794362443 +0000 UTC m=+20.703663154" Apr 22 21:08:59.806534 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.806496 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l25nv" podStartSLOduration=4.003184291 podStartE2EDuration="20.806485172s" podCreationTimestamp="2026-04-22 21:08:39 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.329231134 +0000 UTC m=+3.238531823" lastFinishedPulling="2026-04-22 21:08:59.132532003 +0000 UTC m=+20.041832704" observedRunningTime="2026-04-22 21:08:59.806065935 +0000 UTC m=+20.715366646" watchObservedRunningTime="2026-04-22 21:08:59.806485172 +0000 UTC m=+20.715785918" Apr 22 21:08:59.818026 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:08:59.817986 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dgcb8" podStartSLOduration=4.027619074 podStartE2EDuration="20.817972166s" podCreationTimestamp="2026-04-22 21:08:39 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.342174045 +0000 UTC m=+3.251474735" lastFinishedPulling="2026-04-22 
21:08:59.132527123 +0000 UTC m=+20.041827827" observedRunningTime="2026-04-22 21:08:59.817520443 +0000 UTC m=+20.726821154" watchObservedRunningTime="2026-04-22 21:08:59.817972166 +0000 UTC m=+20.727272877" Apr 22 21:09:00.676048 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:00.675991 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:00.676176 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:00.676145 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:09:00.771404 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:00.771381 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:09:00.771937 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:00.771850 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"f3ea860067677a4f9e3e72d441a5d0754fc8c9b345123c62d2cc6005e08816d0"} Apr 22 21:09:00.771937 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:00.771885 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"a8af06d4baf8bdb85722a78c25465647fbf1c67956cb5be024e05c6d8ff25674"} Apr 22 21:09:00.773844 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:00.773402 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f6hmc" event={"ID":"e2cf1b9f-156b-4e0e-aad4-c6a8d96cb14d","Type":"ContainerStarted","Data":"46b2fb9cf53a31fb24ec100f16de3325c31a703e41d143070bd976d3cd1f660b"} Apr 22 21:09:00.783981 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:00.783937 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f6hmc" podStartSLOduration=4.990372213 podStartE2EDuration="21.783923034s" podCreationTimestamp="2026-04-22 21:08:39 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.338896312 +0000 UTC m=+3.248197007" lastFinishedPulling="2026-04-22 21:08:59.132447136 +0000 UTC m=+20.041747828" observedRunningTime="2026-04-22 21:09:00.783848336 +0000 UTC m=+21.693149047" watchObservedRunningTime="2026-04-22 21:09:00.783923034 +0000 UTC m=+21.693223748" Apr 22 21:09:00.832870 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:00.832834 2546 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 21:09:01.203818 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.203780 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:09:01.204403 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.204383 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:09:01.259145 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.259112 2546 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:09:01.259819 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.259799 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l25nv" Apr 22 21:09:01.466907 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.466834 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-v6vgf"] Apr 22 21:09:01.477095 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.475731 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.478553 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.478265 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q7c24\"" Apr 22 21:09:01.478553 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.478291 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 21:09:01.478553 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.478421 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 21:09:01.587339 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.587299 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbb54413-9542-4220-90ec-df660b4891fa-tmp-dir\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.587536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.587356 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x7t5\" (UniqueName: \"kubernetes.io/projected/fbb54413-9542-4220-90ec-df660b4891fa-kube-api-access-2x7t5\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.587536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.587401 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbb54413-9542-4220-90ec-df660b4891fa-hosts-file\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.675943 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.675902 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:01.676122 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:01.676019 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:09:01.676122 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.676052 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:09:01.676241 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:01.676175 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:09:01.676947 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.676853 2546 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T21:09:00.832853966Z","UUID":"cf24fe9e-3148-44e7-bbbe-d7604db46a96","Handler":null,"Name":"","Endpoint":""} Apr 22 21:09:01.678765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.678732 2546 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 21:09:01.678765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.678765 2546 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 21:09:01.688186 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.688159 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x7t5\" (UniqueName: \"kubernetes.io/projected/fbb54413-9542-4220-90ec-df660b4891fa-kube-api-access-2x7t5\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.688326 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.688193 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbb54413-9542-4220-90ec-df660b4891fa-hosts-file\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.688326 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.688289 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbb54413-9542-4220-90ec-df660b4891fa-tmp-dir\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.688502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.688480 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fbb54413-9542-4220-90ec-df660b4891fa-hosts-file\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.688605 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.688591 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbb54413-9542-4220-90ec-df660b4891fa-tmp-dir\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.697197 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.697171 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x7t5\" (UniqueName: 
\"kubernetes.io/projected/fbb54413-9542-4220-90ec-df660b4891fa-kube-api-access-2x7t5\") pod \"node-resolver-v6vgf\" (UID: \"fbb54413-9542-4220-90ec-df660b4891fa\") " pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.776733 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.776655 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" event={"ID":"3ae026b3-22f4-47d0-a136-0a0697546e33","Type":"ContainerStarted","Data":"3131b999b6237c28d6f3baaaddba05f43a59c969f5393a379be759a72b62c641"} Apr 22 21:09:01.787162 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:01.787136 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v6vgf" Apr 22 21:09:01.978489 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:01.978465 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb54413_9542_4220_90ec_df660b4891fa.slice/crio-2d76ad86d9aff211cfb3c092cdd602ed22462be4536098d00edc1e69c096c49e WatchSource:0}: Error finding container 2d76ad86d9aff211cfb3c092cdd602ed22462be4536098d00edc1e69c096c49e: Status 404 returned error can't find the container with id 2d76ad86d9aff211cfb3c092cdd602ed22462be4536098d00edc1e69c096c49e Apr 22 21:09:02.676623 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:02.676439 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:02.676856 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:02.676736 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:09:02.781619 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:02.781584 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:09:02.782146 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:02.782062 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"a130d370678e36f4e15488625a484add4cd6875aa594bc9ac5df9b85d4532710"} Apr 22 21:09:02.784153 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:02.784127 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" event={"ID":"3ae026b3-22f4-47d0-a136-0a0697546e33","Type":"ContainerStarted","Data":"9b8de5c0e8d206aa2215ab484b59c35b6780ebc723c4adea052f7816d4a1318e"} Apr 22 21:09:02.785536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:02.785511 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v6vgf" event={"ID":"fbb54413-9542-4220-90ec-df660b4891fa","Type":"ContainerStarted","Data":"92f86360b347dafcd847826f61bcce2996ee897ff2ffbd83bb1708b90f9319d7"} Apr 22 21:09:02.785631 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:02.785548 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v6vgf" event={"ID":"fbb54413-9542-4220-90ec-df660b4891fa","Type":"ContainerStarted","Data":"2d76ad86d9aff211cfb3c092cdd602ed22462be4536098d00edc1e69c096c49e"} Apr 22 21:09:02.800009 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:02.799936 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2lljd" podStartSLOduration=3.163469671 podStartE2EDuration="22.799920525s" podCreationTimestamp="2026-04-22 21:08:40 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.339075231 +0000 UTC m=+3.248375925" lastFinishedPulling="2026-04-22 21:09:01.975526081 +0000 UTC m=+22.884826779" observedRunningTime="2026-04-22 21:09:02.79906191 +0000 UTC m=+23.708362625" watchObservedRunningTime="2026-04-22 21:09:02.799920525 +0000 UTC m=+23.709221237" Apr 22 21:09:03.675737 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:03.675707 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:03.675924 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:03.675715 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:09:03.675924 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:03.675824 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:09:03.675924 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:03.675891 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:09:04.675715 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:04.675681 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:04.676290 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:04.675793 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:09:05.675737 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.675457 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:05.676203 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.675472 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:09:05.676203 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:05.675768 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:09:05.676203 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:05.675867 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:09:05.792837 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.792803 2546 generic.go:358] "Generic (PLEG): container finished" podID="0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63" containerID="cb903bac3f90f4a7402c79aa5ed2a165d0b24a6503164175c6736b80b60cd757" exitCode=0 Apr 22 21:09:05.792993 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.792896 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerDied","Data":"cb903bac3f90f4a7402c79aa5ed2a165d0b24a6503164175c6736b80b60cd757"} Apr 22 21:09:05.795835 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.795818 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:09:05.796125 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.796107 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"026f03e384d660d7b6005193e09218a9ac6fa7e2ee38b9af22020ab62e66c177"} Apr 22 21:09:05.796430 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.796412 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:09:05.796506 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.796441 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:09:05.796506 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.796453 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:09:05.796682 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.796663 2546 scope.go:117] "RemoveContainer" containerID="5d09e90db7c0671f82d3b57d5b3b831c32e755d0dc640d5c7c200b9a6e816e3a" Apr 22 21:09:05.811574 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.811557 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:09:05.812001 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.811985 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:09:05.812995 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:05.812954 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v6vgf" podStartSLOduration=4.812938864 podStartE2EDuration="4.812938864s" podCreationTimestamp="2026-04-22 21:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:02.811099688 +0000 UTC m=+23.720400400" watchObservedRunningTime="2026-04-22 21:09:05.812938864 +0000 UTC m=+26.722239576" Apr 22 21:09:06.675576 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:06.675551 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:06.675688 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:06.675667 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:09:06.801353 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:06.801324 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:09:06.801801 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:06.801684 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" event={"ID":"195f14c0-2e98-4e09-8198-0f6696d61fc6","Type":"ContainerStarted","Data":"1b48a09a9f23367cd1818a0697f7cb303bb3c2bb5375118b5cc6fa8d0de687b1"} Apr 22 21:09:06.803583 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:06.803561 2546 generic.go:358] "Generic (PLEG): container finished" podID="0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63" containerID="b34c5ad05f6aa181689ff6cf686d4d39bfda79fc508fc15374a854ddabe2a4c3" exitCode=0 Apr 22 21:09:06.803731 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:06.803607 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerDied","Data":"b34c5ad05f6aa181689ff6cf686d4d39bfda79fc508fc15374a854ddabe2a4c3"} Apr 22 21:09:06.826282 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:06.826237 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" podStartSLOduration=10.779479438 podStartE2EDuration="27.826223095s" podCreationTimestamp="2026-04-22 21:08:39 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.340933089 +0000 UTC m=+3.250233790" lastFinishedPulling="2026-04-22 21:08:59.387676748 +0000 UTC m=+20.296977447" observedRunningTime="2026-04-22 21:09:06.825725853 +0000 UTC m=+27.735026565" watchObservedRunningTime="2026-04-22 21:09:06.826223095 +0000 UTC m=+27.735523822" Apr 22 21:09:07.114238 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.114126 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-flkgv"] Apr 22 21:09:07.114238 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.114219 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:07.114421 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:07.114298 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:09:07.116878 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.116633 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4476p"] Apr 22 21:09:07.116878 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.116758 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:09:07.116878 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:07.116839 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:09:07.124276 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.124251 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hm4qp"] Apr 22 21:09:07.124413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.124361 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:07.124476 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:07.124455 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:09:07.807344 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.807247 2546 generic.go:358] "Generic (PLEG): container finished" podID="0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63" containerID="ea219e40e0ecf1fed3bd20353d9226ba9116e091b843d7c99444778dd18da61e" exitCode=0 Apr 22 21:09:07.807344 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:07.807322 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerDied","Data":"ea219e40e0ecf1fed3bd20353d9226ba9116e091b843d7c99444778dd18da61e"} Apr 22 21:09:08.675875 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:08.675845 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:09:08.676004 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:08.675979 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:09:08.676400 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:08.676381 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:08.676509 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:08.676476 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:09:08.676573 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:08.676529 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:08.676627 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:08.676601 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:09:10.676600 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:10.676402 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:10.677122 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:10.676465 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:09:10.677122 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:10.676719 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-flkgv" podUID="e49b46d0-3a94-438f-bdc2-b93ad7f84cde" Apr 22 21:09:10.677122 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:10.676481 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:10.677122 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:10.676825 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4476p" podUID="046a767f-6fe0-4c57-ba82-6a2ce75f7864" Apr 22 21:09:10.677122 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:10.676865 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hm4qp" podUID="dc5617e8-2aaf-4671-9fab-db6988f9b8b4" Apr 22 21:09:12.389059 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.389028 2546 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeReady" Apr 22 21:09:12.389565 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.389177 2546 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 21:09:12.419902 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.419870 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8576cd8d8-tvm8d"] Apr 22 21:09:12.424845 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.424820 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6"] Apr 22 21:09:12.425036 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.424993 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.427521 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.427499 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 21:09:12.427713 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.427636 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"] Apr 22 21:09:12.427808 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.427733 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 21:09:12.427868 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.427845 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jsc9g\"" Apr 22 21:09:12.427919 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.427864 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.428037 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.428014 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 21:09:12.429893 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.429874 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-2zh7q\"" Apr 22 21:09:12.430022 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.429962 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 21:09:12.430086 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.430030 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 21:09:12.430251 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.430231 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.430379 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.430365 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.431373 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.431355 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg"] Apr 22 21:09:12.431561 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.431529 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" Apr 22 21:09:12.434526 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.433534 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 21:09:12.434526 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.433559 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.434526 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.433703 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.434526 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.433980 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-v2j4k\"" Apr 22 21:09:12.434526 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.433991 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"] Apr 22 21:09:12.434526 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.434096 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.435683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.434662 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 21:09:12.436583 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.436564 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.436820 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.436804 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 21:09:12.437019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.436999 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-66pzw\"" Apr 22 21:09:12.437226 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.437206 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.437319 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.437260 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 21:09:12.438063 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.438044 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2rdf2"] Apr 22 21:09:12.438190 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.438175 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.443170 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.441380 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.443170 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.441580 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 21:09:12.443170 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.441843 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.443170 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.442405 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bxtgl\"" Apr 22 21:09:12.443526 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.443323 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.450443 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw"] Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.450782 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.453015 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.453528 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.453539 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.453587 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-44nf9"] Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.453689 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-58fdw\"" Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.453733 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 21:09:12.453987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.453992 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw" Apr 22 21:09:12.456498 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.456442 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zft7v\"" Apr 22 21:09:12.456688 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.456667 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.457408 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.457391 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.457578 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.457556 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-264s7"] Apr 22 21:09:12.461844 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.460550 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-569c57978d-cwfqs"] Apr 22 21:09:12.461844 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.461221 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.462529 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.462500 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 21:09:12.464853 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.464229 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lt772"] Apr 22 21:09:12.464853 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.464332 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.465495 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.465473 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 21:09:12.465582 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.465538 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ll98\"" Apr 22 21:09:12.465777 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.465744 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 21:09:12.466542 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.466523 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bt2gr\"" Apr 22 21:09:12.466683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.466592 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.466761 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.466529 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 21:09:12.466761 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.466696 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 21:09:12.468182 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.468052 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.469017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.468997 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6"] Apr 22 21:09:12.469124 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.469027 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xbm9h"] Apr 22 21:09:12.470670 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.469679 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:12.471518 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.471500 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 21:09:12.472476 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.472456 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"] Apr 22 21:09:12.472563 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.472482 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd"] Apr 22 21:09:12.473201 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.473180 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:12.479960 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.479940 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 21:09:12.480251 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.480229 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 21:09:12.480480 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.480463 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gtqgh\"" Apr 22 21:09:12.482017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.481995 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.482224 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482202 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 21:09:12.482472 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482452 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg"] Apr 22 21:09:12.482548 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482491 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"] Apr 22 21:09:12.482548 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482507 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8576cd8d8-tvm8d"] Apr 22 21:09:12.482548 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482520 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2rdf2"] Apr 22 21:09:12.482548 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482531 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-44nf9"] Apr 22 21:09:12.482548 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482542 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd"] Apr 22 21:09:12.482803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482556 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-569c57978d-cwfqs"] Apr 22 21:09:12.482803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482567 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lt772"] Apr 22 21:09:12.482803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482591 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw"] Apr 22 21:09:12.482803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482602 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xbm9h"] Apr 22 21:09:12.482803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482615 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-264s7"] Apr 22 21:09:12.482803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482741 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd" Apr 22 21:09:12.483117 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482841 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.483117 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.482870 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.483718 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.483702 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-c86tt\"" Apr 22 21:09:12.485599 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485556 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.485804 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485781 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-tr46n\"" Apr 22 21:09:12.485897 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485627 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 21:09:12.485897 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485821 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 21:09:12.485897 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485867 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-w4bt5\"" Apr 22 21:09:12.485897 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485665 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 21:09:12.486081 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485658 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 21:09:12.486081 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485704 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.486081 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485625 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 21:09:12.486081 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.485735 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 21:09:12.571907 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.571880 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-trusted-ca\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.572066 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.571920 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-stats-auth\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.572066 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.571945 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjsjc\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-kube-api-access-vjsjc\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.572066 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572013 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/318e56a6-6fae-4d59-84d7-4a832ad3b200-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.572201 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572069 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bzcs\" (UniqueName: \"kubernetes.io/projected/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-kube-api-access-7bzcs\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.572201 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572091 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smx26\" (UniqueName: \"kubernetes.io/projected/0bf09f77-1234-462f-bd11-01faf61546b9-kube-api-access-smx26\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.572201 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572109 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmsv\" (UniqueName: \"kubernetes.io/projected/dcde08fd-6ed0-4845-a8b3-39ad205d8738-kube-api-access-5cmsv\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" Apr 22 21:09:12.572201 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572134 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fb72504e-c4a7-4b19-a04d-f51d0fd91812-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:12.572201 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572153 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.572201 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572187 2546 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-serving-cert\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572211 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qrj4\" (UniqueName: \"kubernetes.io/projected/3812e9ff-4663-427d-8f9a-a20d66323320-kube-api-access-6qrj4\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572235 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-service-ca-bundle\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572259 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-config\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572312 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-bound-sa-token\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572352 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-serving-cert\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572389 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-default-certificate\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572417 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572444 2546 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzfbk\" (UniqueName: \"kubernetes.io/projected/7e715539-0e5b-418b-a686-dfcf837def03-kube-api-access-vzfbk\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.572497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572477 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jsm\" (UniqueName: \"kubernetes.io/projected/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-kube-api-access-c7jsm\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572503 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3812e9ff-4663-427d-8f9a-a20d66323320-config-volume\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572531 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572561 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-snapshots\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572587 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-image-registry-private-configuration\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572617 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e715539-0e5b-418b-a686-dfcf837def03-serving-cert\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572666 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 
21:09:12.572694 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572729 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ksdc\" (UniqueName: \"kubernetes.io/projected/318e56a6-6fae-4d59-84d7-4a832ad3b200-kube-api-access-5ksdc\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572755 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-ca-trust-extracted\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572783 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d34322a-f3f9-4e65-8714-cec36115f27e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572808 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfgn\" (UniqueName: \"kubernetes.io/projected/2d34322a-f3f9-4e65-8714-cec36115f27e-kube-api-access-2xfgn\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572836 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572862 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572891 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/3812e9ff-4663-427d-8f9a-a20d66323320-tmp-dir\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.572935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572918 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572945 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-tmp\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572967 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e715539-0e5b-418b-a686-dfcf837def03-config\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.572992 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d34322a-f3f9-4e65-8714-cec36115f27e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.573016 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wbn\" (UniqueName: \"kubernetes.io/projected/35136bae-7913-4fe1-9aa1-51925e06b1c6-kube-api-access-f9wbn\") pod \"volume-data-source-validator-7c6cbb6c87-98ckw\" (UID: \"35136bae-7913-4fe1-9aa1-51925e06b1c6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.573044 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h75x\" (UniqueName: \"kubernetes.io/projected/84ec7f24-1a5d-4fbc-98f4-38f3250c761d-kube-api-access-5h75x\") pod \"network-check-source-8894fc9bd-hqxdd\" (UID: \"84ec7f24-1a5d-4fbc-98f4-38f3250c761d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.573079 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-installation-pull-secrets\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.573103 2546 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e715539-0e5b-418b-a686-dfcf837def03-trusted-ca\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.573127 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.573157 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfc4v\" (UniqueName: \"kubernetes.io/projected/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-kube-api-access-mfc4v\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:12.573502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.573183 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-certificates\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.674280 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674194 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:12.674280 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674247 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ksdc\" (UniqueName: \"kubernetes.io/projected/318e56a6-6fae-4d59-84d7-4a832ad3b200-kube-api-access-5ksdc\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.674487 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674277 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-ca-trust-extracted\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.674487 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.674364 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:12.674487 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674392 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d34322a-f3f9-4e65-8714-cec36115f27e-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.674487 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.674440 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert podName:dd46fc2d-0bfc-4e1b-b429-0d14ff029faf nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.174420349 +0000 UTC m=+34.083721050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert") pod "ingress-canary-xbm9h" (UID: "dd46fc2d-0bfc-4e1b-b429-0d14ff029faf") : secret "canary-serving-cert" not found Apr 22 21:09:12.674487 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674464 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfgn\" (UniqueName: \"kubernetes.io/projected/2d34322a-f3f9-4e65-8714-cec36115f27e-kube-api-access-2xfgn\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674495 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674521 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674549 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3812e9ff-4663-427d-8f9a-a20d66323320-tmp-dir\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674575 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674603 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-tmp\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.674798 
ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674628 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e715539-0e5b-418b-a686-dfcf837def03-config\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674672 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d34322a-f3f9-4e65-8714-cec36115f27e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.674698 2546 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674702 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wbn\" (UniqueName: \"kubernetes.io/projected/35136bae-7913-4fe1-9aa1-51925e06b1c6-kube-api-access-f9wbn\") pod \"volume-data-source-validator-7c6cbb6c87-98ckw\" (UID: \"35136bae-7913-4fe1-9aa1-51925e06b1c6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674726 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-ca-trust-extracted\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.674765 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls podName:dcde08fd-6ed0-4845-a8b3-39ad205d8738 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.174747335 +0000 UTC m=+34.084048030 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-z95n7" (UID: "dcde08fd-6ed0-4845-a8b3-39ad205d8738") : secret "samples-operator-tls" not found Apr 22 21:09:12.674798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674729 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h75x\" (UniqueName: \"kubernetes.io/projected/84ec7f24-1a5d-4fbc-98f4-38f3250c761d-kube-api-access-5h75x\") pod \"network-check-source-8894fc9bd-hqxdd\" (UID: \"84ec7f24-1a5d-4fbc-98f4-38f3250c761d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674814 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-installation-pull-secrets\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674868 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e715539-0e5b-418b-a686-dfcf837def03-trusted-ca\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674896 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.674968 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfc4v\" (UniqueName: \"kubernetes.io/projected/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-kube-api-access-mfc4v\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675007 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3812e9ff-4663-427d-8f9a-a20d66323320-tmp-dir\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675020 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-certificates\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675029 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2d34322a-f3f9-4e65-8714-cec36115f27e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.675040 2546 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675076 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-trusted-ca\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.675081 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert podName:fb72504e-c4a7-4b19-a04d-f51d0fd91812 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.175066494 +0000 UTC m=+34.084367183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lt772" (UID: "fb72504e-c4a7-4b19-a04d-f51d0fd91812") : secret "networking-console-plugin-cert" not found Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675092 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-tmp\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675124 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-stats-auth\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675163 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjsjc\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-kube-api-access-vjsjc\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675207 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/318e56a6-6fae-4d59-84d7-4a832ad3b200-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675248 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7bzcs\" (UniqueName: \"kubernetes.io/projected/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-kube-api-access-7bzcs\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.675413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675276 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smx26\" (UniqueName: \"kubernetes.io/projected/0bf09f77-1234-462f-bd11-01faf61546b9-kube-api-access-smx26\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675303 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmsv\" (UniqueName: \"kubernetes.io/projected/dcde08fd-6ed0-4845-a8b3-39ad205d8738-kube-api-access-5cmsv\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675342 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fb72504e-c4a7-4b19-a04d-f51d0fd91812-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675368 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675414 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-serving-cert\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675443 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qrj4\" (UniqueName: \"kubernetes.io/projected/3812e9ff-4663-427d-8f9a-a20d66323320-kube-api-access-6qrj4\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675467 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-service-ca-bundle\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675507 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-config\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675532 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-bound-sa-token\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675561 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-serving-cert\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675585 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-default-certificate\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675613 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675639 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzfbk\" (UniqueName: \"kubernetes.io/projected/7e715539-0e5b-418b-a686-dfcf837def03-kube-api-access-vzfbk\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675690 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jsm\" (UniqueName: \"kubernetes.io/projected/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-kube-api-access-c7jsm\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675718 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3812e9ff-4663-427d-8f9a-a20d66323320-config-volume\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675747 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " 
pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.676123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675762 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e715539-0e5b-418b-a686-dfcf837def03-config\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675777 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-snapshots\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675831 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675837 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-image-registry-private-configuration\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675889 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e715539-0e5b-418b-a686-dfcf837def03-serving-cert\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675930 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.675966 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.676064 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.176050203 +0000 UTC m=+34.085350913 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : configmap references non-existent config key: service-ca.crt Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.676299 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-snapshots\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.676446 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.676568 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-trusted-ca\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.676583 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-config\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.676681 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:09:12.676769 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.676723 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls podName:318e56a6-6fae-4d59-84d7-4a832ad3b200 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.176709712 +0000 UTC m=+34.086010408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l798c" (UID: "318e56a6-6fae-4d59-84d7-4a832ad3b200") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:09:12.677589 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.676776 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-service-ca-bundle\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9" Apr 22 21:09:12.677589 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.677055 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:12.677589 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.677108 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls podName:3812e9ff-4663-427d-8f9a-a20d66323320 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.177092656 +0000 UTC m=+34.086393352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls") pod "dns-default-264s7" (UID: "3812e9ff-4663-427d-8f9a-a20d66323320") : secret "dns-default-metrics-tls" not found Apr 22 21:09:12.677589 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.677214 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv" Apr 22 21:09:12.677901 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.677876 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e715539-0e5b-418b-a686-dfcf837def03-trusted-ca\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.677962 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.677947 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3812e9ff-4663-427d-8f9a-a20d66323320-config-volume\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:12.679029 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.678195 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:12.679029 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.678215 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8576cd8d8-tvm8d: secret "image-registry-tls" not found Apr 22 21:09:12.679029 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.678261 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls podName:aae8c028-d54c-4e48-8065-01b3b3bbc0b0 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.178245958 +0000 UTC m=+34.087546646 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls") pod "image-registry-8576cd8d8-tvm8d" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0") : secret "image-registry-tls" not found Apr 22 21:09:12.679029 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.678334 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 21:09:12.679029 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:12.678370 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.178358464 +0000 UTC m=+34.087659154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : secret "router-metrics-certs-default" not found Apr 22 21:09:12.679370 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.679101 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-certificates\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:12.680342 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.680321 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fb72504e-c4a7-4b19-a04d-f51d0fd91812-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:12.681108 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.680434 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/318e56a6-6fae-4d59-84d7-4a832ad3b200-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:12.681108 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.680903 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-default-certificate\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:12.681108 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.680947 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 21:09:12.681108 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.681003 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e715539-0e5b-418b-a686-dfcf837def03-serving-cert\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:12.681108 
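The durationBeforeRetry values in the failures above and below (500ms here, then 1s, 2s, 4s, and 32s for a mount that has evidently been failing since before this excerpt) show the volume manager doubling its retry delay after each failed MountVolume.SetUp. A minimal Go sketch of that doubling pattern, assuming an illustrative 2-minute cap (the base matches the log; the cap is an assumption, not kubelet's actual constant from nestedpendingoperations.go):

```go
package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the previous retry delay up to maxDelay, mirroring the
// durationBeforeRetry progression visible in the entries above
// (500ms, 1s, 2s, 4s, ...). The cap is illustrative only.
func nextDelay(prev, maxDelay time.Duration) time.Duration {
	d := prev * 2
	if d > maxDelay {
		d = maxDelay
	}
	return d
}

func main() {
	d := 500 * time.Millisecond // first retry after the initial failure
	for i := 0; i < 8; i++ {
		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s
		d = nextDelay(d, 2*time.Minute)
	}
}
```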
Apr 22 21:09:12.681777 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.681363 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d34322a-f3f9-4e65-8714-cec36115f27e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg"
Apr 22 21:09:12.681777 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.681599 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ccfht\""
Apr 22 21:09:12.681777 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.681625 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-stats-auth\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:12.681991 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.681792 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bm97r\""
Apr 22 21:09:12.681991 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.681947 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-serving-cert\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9"
Apr 22 21:09:12.681991 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.681963 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-image-registry-private-configuration\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:12.682600 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.682570 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-installation-pull-secrets\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:12.684250 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.683973 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfgn\" (UniqueName: \"kubernetes.io/projected/2d34322a-f3f9-4e65-8714-cec36115f27e-kube-api-access-2xfgn\") pod \"kube-storage-version-migrator-operator-6769c5d45-567gg\" (UID: \"2d34322a-f3f9-4e65-8714-cec36115f27e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg"
Apr 22 21:09:12.684668 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.684619 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-serving-cert\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6"
Apr 22 21:09:12.686954 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.686928 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ksdc\" (UniqueName: \"kubernetes.io/projected/318e56a6-6fae-4d59-84d7-4a832ad3b200-kube-api-access-5ksdc\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"
Apr 22 21:09:12.687743 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.687563 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-bound-sa-token\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:12.688989 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.688400 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzfbk\" (UniqueName: \"kubernetes.io/projected/7e715539-0e5b-418b-a686-dfcf837def03-kube-api-access-vzfbk\") pod \"console-operator-9d4b6777b-2rdf2\" (UID: \"7e715539-0e5b-418b-a686-dfcf837def03\") " pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2"
Apr 22 21:09:12.691469 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.689578 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qrj4\" (UniqueName: \"kubernetes.io/projected/3812e9ff-4663-427d-8f9a-a20d66323320-kube-api-access-6qrj4\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7"
Apr 22 21:09:12.691469 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.689704 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h75x\" (UniqueName: \"kubernetes.io/projected/84ec7f24-1a5d-4fbc-98f4-38f3250c761d-kube-api-access-5h75x\") pod \"network-check-source-8894fc9bd-hqxdd\" (UID: \"84ec7f24-1a5d-4fbc-98f4-38f3250c761d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd"
Apr 22 21:09:12.691469 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.690083 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wbn\" (UniqueName: \"kubernetes.io/projected/35136bae-7913-4fe1-9aa1-51925e06b1c6-kube-api-access-f9wbn\") pod \"volume-data-source-validator-7c6cbb6c87-98ckw\" (UID: \"35136bae-7913-4fe1-9aa1-51925e06b1c6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw"
Apr 22 21:09:12.691469 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.690333 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfc4v\" (UniqueName: \"kubernetes.io/projected/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-kube-api-access-mfc4v\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h"
Apr 22 21:09:12.691469 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.690859 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjsjc\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-kube-api-access-vjsjc\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:12.691985 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.691962 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smx26\" (UniqueName: \"kubernetes.io/projected/0bf09f77-1234-462f-bd11-01faf61546b9-kube-api-access-smx26\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:12.692528 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.692504 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cmsv\" (UniqueName: \"kubernetes.io/projected/dcde08fd-6ed0-4845-a8b3-39ad205d8738-kube-api-access-5cmsv\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"
Apr 22 21:09:12.693730 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.693707 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jsm\" (UniqueName: \"kubernetes.io/projected/4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347-kube-api-access-c7jsm\") pod \"service-ca-operator-d6fc45fc5-b4sm6\" (UID: \"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6"
Apr 22 21:09:12.694051 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.694028 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bzcs\" (UniqueName: \"kubernetes.io/projected/dd3b0638-5a78-4ca8-8274-4653bf8b98d0-kube-api-access-7bzcs\") pod \"insights-operator-585dfdc468-44nf9\" (UID: \"dd3b0638-5a78-4ca8-8274-4653bf8b98d0\") " pod="openshift-insights/insights-operator-585dfdc468-44nf9"
Apr 22 21:09:12.751660 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.751609 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6"
Apr 22 21:09:12.781978 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.781720 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw"
Apr 22 21:09:12.790223 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.790196 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg"
Apr 22 21:09:12.805173 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.805142 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2"
Apr 22 21:09:12.821443 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.821420 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-44nf9"
Apr 22 21:09:12.841374 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:12.841347 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd"
Apr 22 21:09:13.181083 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181045 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:13.181270 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181091 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h"
Apr 22 21:09:13.181270 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181115 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772"
Apr 22 21:09:13.181270 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181137 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"
Apr 22 21:09:13.181270 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181177 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"
Apr 22 21:09:13.181270 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181227 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181226 2546 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181286 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181292 2546 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181239 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181220226 +0000 UTC m=+35.090520934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : configmap references non-existent config key: service-ca.crt
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181358 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181422 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7"
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181426 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181450 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert podName:dd46fc2d-0bfc-4e1b-b429-0d14ff029faf nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181434107 +0000 UTC m=+35.090734803 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert") pod "ingress-canary-xbm9h" (UID: "dd46fc2d-0bfc-4e1b-b429-0d14ff029faf") : secret "canary-serving-cert" not found
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181476 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert podName:fb72504e-c4a7-4b19-a04d-f51d0fd91812 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181465121 +0000 UTC m=+35.090765812 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lt772" (UID: "fb72504e-c4a7-4b19-a04d-f51d0fd91812") : secret "networking-console-plugin-cert" not found
Apr 22 21:09:13.181481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181476 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181501 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls podName:318e56a6-6fae-4d59-84d7-4a832ad3b200 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181492534 +0000 UTC m=+35.090793223 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l798c" (UID: "318e56a6-6fae-4d59-84d7-4a832ad3b200") : secret "cluster-monitoring-operator-tls" not found
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181521 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls podName:dcde08fd-6ed0-4845-a8b3-39ad205d8738 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181513586 +0000 UTC m=+35.090814283 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-z95n7" (UID: "dcde08fd-6ed0-4845-a8b3-39ad205d8738") : secret "samples-operator-tls" not found
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181537 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181529387 +0000 UTC m=+35.090830080 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : secret "router-metrics-certs-default" not found
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181553 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls podName:3812e9ff-4663-427d-8f9a-a20d66323320 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181546205 +0000 UTC m=+35.090846895 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls") pod "dns-default-264s7" (UID: "3812e9ff-4663-427d-8f9a-a20d66323320") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.181590 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181699 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181709 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8576cd8d8-tvm8d: secret "image-registry-tls" not found
Apr 22 21:09:13.181850 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.181737 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls podName:aae8c028-d54c-4e48-8065-01b3b3bbc0b0 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.181730019 +0000 UTC m=+35.091030711 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls") pod "image-registry-8576cd8d8-tvm8d" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0") : secret "image-registry-tls" not found
Apr 22 21:09:13.383987 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.383814 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:09:13.383987 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.383963 2546 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 21:09:13.384366 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:13.384260 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs podName:046a767f-6fe0-4c57-ba82-6a2ce75f7864 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:45.384229506 +0000 UTC m=+66.293530209 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs") pod "network-metrics-daemon-4476p" (UID: "046a767f-6fe0-4c57-ba82-6a2ce75f7864") : secret "metrics-daemon-secret" not found
Apr 22 21:09:13.587924 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.587321 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:09:13.591738 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.591715 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98kt\" (UniqueName: \"kubernetes.io/projected/dc5617e8-2aaf-4671-9fab-db6988f9b8b4-kube-api-access-x98kt\") pod \"network-check-target-hm4qp\" (UID: \"dc5617e8-2aaf-4671-9fab-db6988f9b8b4\") " pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:09:13.598011 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.597982 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd"]
Apr 22 21:09:13.602097 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:13.602063 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ec7f24_1a5d_4fbc_98f4_38f3250c761d.slice/crio-104a0883d6351e02d8541ee822b14f28e63f9464fed90c3edda5ae628d5a3a46 WatchSource:0}: Error finding container 104a0883d6351e02d8541ee822b14f28e63f9464fed90c3edda5ae628d5a3a46: Status 404 returned error can't find the container with id 104a0883d6351e02d8541ee822b14f28e63f9464fed90c3edda5ae628d5a3a46
Apr 22 21:09:13.604109 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.604061 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6"]
Apr 22 21:09:13.609152 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:13.609130 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d6941f8_9b27_4ee7_b48d_7b1ba8e5a347.slice/crio-5293444bbf231367f383a88ccee25e01e16619e212f8cdf833a19d3a356004e7 WatchSource:0}: Error finding container 5293444bbf231367f383a88ccee25e01e16619e212f8cdf833a19d3a356004e7: Status 404 returned error can't find the container with id 5293444bbf231367f383a88ccee25e01e16619e212f8cdf833a19d3a356004e7
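Most of the failures above are missing secrets, but the router's service-ca-bundle mount fails differently: the configmap object exists and the error is configmap references non-existent config key: service-ca.crt, i.e. the object is present but does not yet carry the key the volume projects. A client-go sketch for inspecting which keys the configmap actually holds (namespace, name, and key are taken from the log; the kubeconfig location is an assumption):

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	cm, err := cs.CoreV1().ConfigMaps("openshift-ingress").Get(context.TODO(), "service-ca-bundle", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err) // the configmap itself is missing
	}
	if _, ok := cm.Data["service-ca.crt"]; !ok {
		// Matches the kubelet failure above: object present, key absent.
		fmt.Println("service-ca-bundle exists but lacks key service-ca.crt")
	}
	for k := range cm.Data {
		fmt.Println("present key:", k)
	}
}
```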
Apr 22 21:09:13.615736 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.615714 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw"]
Apr 22 21:09:13.616821 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.616762 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg"]
Apr 22 21:09:13.618336 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.618316 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-44nf9"]
Apr 22 21:09:13.619977 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:13.619948 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35136bae_7913_4fe1_9aa1_51925e06b1c6.slice/crio-7c1823b2e6910488ecd144d6f6daae6b8cf1ec79bd458a999391260e2a77c060 WatchSource:0}: Error finding container 7c1823b2e6910488ecd144d6f6daae6b8cf1ec79bd458a999391260e2a77c060: Status 404 returned error can't find the container with id 7c1823b2e6910488ecd144d6f6daae6b8cf1ec79bd458a999391260e2a77c060
Apr 22 21:09:13.620624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.620548 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:09:13.621313 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.621288 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2rdf2"]
Apr 22 21:09:13.621990 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:13.621866 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d34322a_f3f9_4e65_8714_cec36115f27e.slice/crio-369bede46a20aa79ef1b228a962359de352d15ffb4fcba2b080f24c68b22b1fd WatchSource:0}: Error finding container 369bede46a20aa79ef1b228a962359de352d15ffb4fcba2b080f24c68b22b1fd: Status 404 returned error can't find the container with id 369bede46a20aa79ef1b228a962359de352d15ffb4fcba2b080f24c68b22b1fd
Apr 22 21:09:13.622414 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:13.622392 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3b0638_5a78_4ca8_8274_4653bf8b98d0.slice/crio-4b75d733d0d9b58ab034c0c2d4e23855962cb54a8263491f7166cd0d70a538e2 WatchSource:0}: Error finding container 4b75d733d0d9b58ab034c0c2d4e23855962cb54a8263491f7166cd0d70a538e2: Status 404 returned error can't find the container with id 4b75d733d0d9b58ab034c0c2d4e23855962cb54a8263491f7166cd0d70a538e2
Apr 22 21:09:13.629361 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:13.629338 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e715539_0e5b_418b_a686_dfcf837def03.slice/crio-6dddb0bdbb096afa10731cdda539f1c57fa12261a16103381b9c2ad711b1789a WatchSource:0}: Error finding container 6dddb0bdbb096afa10731cdda539f1c57fa12261a16103381b9c2ad711b1789a: Status 404 returned error can't find the container with id 6dddb0bdbb096afa10731cdda539f1c57fa12261a16103381b9c2ad711b1789a
Apr 22 21:09:13.762710 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.762637 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hm4qp"]
Apr 22 21:09:13.766067 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:13.766034 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5617e8_2aaf_4671_9fab_db6988f9b8b4.slice/crio-af612e4b88d16b9bbae1f29f3e3c74185969dbaf0a3eef23534c9456aab70f6b WatchSource:0}: Error finding container af612e4b88d16b9bbae1f29f3e3c74185969dbaf0a3eef23534c9456aab70f6b: Status 404 returned error can't find the container with id af612e4b88d16b9bbae1f29f3e3c74185969dbaf0a3eef23534c9456aab70f6b
Apr 22 21:09:13.820285 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.820237 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hm4qp" event={"ID":"dc5617e8-2aaf-4671-9fab-db6988f9b8b4","Type":"ContainerStarted","Data":"af612e4b88d16b9bbae1f29f3e3c74185969dbaf0a3eef23534c9456aab70f6b"}
Apr 22 21:09:13.821282 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.821260 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw" event={"ID":"35136bae-7913-4fe1-9aa1-51925e06b1c6","Type":"ContainerStarted","Data":"7c1823b2e6910488ecd144d6f6daae6b8cf1ec79bd458a999391260e2a77c060"}
Apr 22 21:09:13.823991 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.823966 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerStarted","Data":"040542f49edf911b3dfe4f56e4d3eeabec3840a5ff35c53ff923aac6bdefeaa6"}
Apr 22 21:09:13.825072 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.825049 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" event={"ID":"2d34322a-f3f9-4e65-8714-cec36115f27e","Type":"ContainerStarted","Data":"369bede46a20aa79ef1b228a962359de352d15ffb4fcba2b080f24c68b22b1fd"}
Apr 22 21:09:13.826017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.825985 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" event={"ID":"7e715539-0e5b-418b-a686-dfcf837def03","Type":"ContainerStarted","Data":"6dddb0bdbb096afa10731cdda539f1c57fa12261a16103381b9c2ad711b1789a"}
Apr 22 21:09:13.826995 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.826973 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" event={"ID":"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347","Type":"ContainerStarted","Data":"5293444bbf231367f383a88ccee25e01e16619e212f8cdf833a19d3a356004e7"}
Apr 22 21:09:13.827949 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.827927 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44nf9" event={"ID":"dd3b0638-5a78-4ca8-8274-4653bf8b98d0","Type":"ContainerStarted","Data":"4b75d733d0d9b58ab034c0c2d4e23855962cb54a8263491f7166cd0d70a538e2"}
Apr 22 21:09:13.828925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:13.828889 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd" event={"ID":"84ec7f24-1a5d-4fbc-98f4-38f3250c761d","Type":"ContainerStarted","Data":"104a0883d6351e02d8541ee822b14f28e63f9464fed90c3edda5ae628d5a3a46"}
Apr 22 21:09:14.194620 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194585 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:14.194822 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194699 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7"
Apr 22 21:09:14.194822 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194751 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:14.194822 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194796 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:14.194983 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194827 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h"
Apr 22 21:09:14.194983 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194862 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772"
Apr 22 21:09:14.194983 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194892 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"
Apr 22 21:09:14.194983 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.194926 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"
Apr 22 21:09:14.195176 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195050 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 21:09:14.195176 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195106 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls podName:318e56a6-6fae-4d59-84d7-4a832ad3b200 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.195092466 +0000 UTC m=+37.104393171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l798c" (UID: "318e56a6-6fae-4d59-84d7-4a832ad3b200") : secret "cluster-monitoring-operator-tls" not found
Apr 22 21:09:14.195176 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195147 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 21:09:14.195341 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195179 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.19517314 +0000 UTC m=+37.104473829 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : secret "router-metrics-certs-default" not found
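The SyncLoop (PLEG) entries a few lines up carry a structured event={...} payload: ID is the pod UID, Type is the lifecycle transition, and Data is the container or sandbox ID, the same IDs that appear in the Failed to process watch event ... 404 warnings (apparently a benign race where the cgroup watcher sees the new slice before the runtime can report the container). The payload in these lines happens to be valid JSON, so it can be decoded mechanically; a small Go sketch, where the struct is a local convenience mirroring the fields seen above, not a kubelet type:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// plegEvent mirrors the event={...} payload shape in the
// "SyncLoop (PLEG): event for pod" lines above.
type plegEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted, ContainerDied
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	// Payload copied verbatim from the network-check-target entry above.
	raw := `{"ID":"dc5617e8-2aaf-4671-9fab-db6988f9b8b4","Type":"ContainerStarted","Data":"af612e4b88d16b9bbae1f29f3e3c74185969dbaf0a3eef23534c9456aab70f6b"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s %s\n", ev.ID, ev.Type, ev.Data)
}
```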
Apr 22 21:09:14.195341 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195214 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:14.195341 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195232 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls podName:3812e9ff-4663-427d-8f9a-a20d66323320 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.195226126 +0000 UTC m=+37.104526815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls") pod "dns-default-264s7" (UID: "3812e9ff-4663-427d-8f9a-a20d66323320") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:14.195341 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195279 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:14.195341 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195287 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8576cd8d8-tvm8d: secret "image-registry-tls" not found
Apr 22 21:09:14.195341 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195317 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls podName:aae8c028-d54c-4e48-8065-01b3b3bbc0b0 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.195299083 +0000 UTC m=+37.104599772 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls") pod "image-registry-8576cd8d8-tvm8d" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0") : secret "image-registry-tls" not found
Apr 22 21:09:14.195635 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195359 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.195353652 +0000 UTC m=+37.104654340 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : configmap references non-existent config key: service-ca.crt
Apr 22 21:09:14.195635 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195399 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:14.195635 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195416 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert podName:dd46fc2d-0bfc-4e1b-b429-0d14ff029faf nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.195410911 +0000 UTC m=+37.104711600 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert") pod "ingress-canary-xbm9h" (UID: "dd46fc2d-0bfc-4e1b-b429-0d14ff029faf") : secret "canary-serving-cert" not found
Apr 22 21:09:14.195635 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195451 2546 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 21:09:14.195635 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195478 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert podName:fb72504e-c4a7-4b19-a04d-f51d0fd91812 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.195469335 +0000 UTC m=+37.104770031 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lt772" (UID: "fb72504e-c4a7-4b19-a04d-f51d0fd91812") : secret "networking-console-plugin-cert" not found
Apr 22 21:09:14.195635 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195519 2546 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 21:09:14.195635 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:14.195536 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls podName:dcde08fd-6ed0-4845-a8b3-39ad205d8738 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.195530371 +0000 UTC m=+37.104831059 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-z95n7" (UID: "dcde08fd-6ed0-4845-a8b3-39ad205d8738") : secret "samples-operator-tls" not found
Apr 22 21:09:14.499430 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.499343 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:09:14.511516 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.511454 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e49b46d0-3a94-438f-bdc2-b93ad7f84cde-original-pull-secret\") pod \"global-pull-secret-syncer-flkgv\" (UID: \"e49b46d0-3a94-438f-bdc2-b93ad7f84cde\") " pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:09:14.532926 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.532510 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-flkgv"
Apr 22 21:09:14.706098 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.705793 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-flkgv"]
Apr 22 21:09:14.712700 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:14.712639 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49b46d0_3a94_438f_bdc2_b93ad7f84cde.slice/crio-b0e7f1475743fd6164ae7819bcc7a617f4d46c5218d35644e1b875a8e3b62049 WatchSource:0}: Error finding container b0e7f1475743fd6164ae7819bcc7a617f4d46c5218d35644e1b875a8e3b62049: Status 404 returned error can't find the container with id b0e7f1475743fd6164ae7819bcc7a617f4d46c5218d35644e1b875a8e3b62049
Apr 22 21:09:14.833723 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.833543 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-flkgv" event={"ID":"e49b46d0-3a94-438f-bdc2-b93ad7f84cde","Type":"ContainerStarted","Data":"b0e7f1475743fd6164ae7819bcc7a617f4d46c5218d35644e1b875a8e3b62049"}
Apr 22 21:09:14.838617 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.838584 2546 generic.go:358] "Generic (PLEG): container finished" podID="0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63" containerID="040542f49edf911b3dfe4f56e4d3eeabec3840a5ff35c53ff923aac6bdefeaa6" exitCode=0
Apr 22 21:09:14.838751 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:14.838658 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerDied","Data":"040542f49edf911b3dfe4f56e4d3eeabec3840a5ff35c53ff923aac6bdefeaa6"}
Apr 22 21:09:15.846534 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:15.846497 2546 generic.go:358] "Generic (PLEG): container finished" podID="0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63" containerID="6b02da5c228f77ed5ca19bf4b229a34cf0ea0ef50b269480bef4f6a5110f0f40" exitCode=0
Apr 22 21:09:15.847087 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:15.846561 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerDied","Data":"6b02da5c228f77ed5ca19bf4b229a34cf0ea0ef50b269480bef4f6a5110f0f40"}
Apr 22 21:09:16.222362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222249 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7"
Apr 22 21:09:16.222362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222313 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:16.222560 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222363 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:16.222560 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222395 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h"
Apr 22 21:09:16.222560 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222430 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772"
Apr 22 21:09:16.222560 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222461 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"
Apr 22 21:09:16.222560 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222512 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"
Apr 22 21:09:16.222925 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.222579 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:16.222925 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.222763 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 21:09:16.222925 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.222825 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.222806435 +0000 UTC m=+41.132107127 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : secret "router-metrics-certs-default" not found
Apr 22 21:09:16.223228 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223212 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:16.223296 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223223 2546 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 21:09:16.223296 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223290 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls podName:dcde08fd-6ed0-4845-a8b3-39ad205d8738 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.223271363 +0000 UTC m=+41.132572065 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-z95n7" (UID: "dcde08fd-6ed0-4845-a8b3-39ad205d8738") : secret "samples-operator-tls" not found
Apr 22 21:09:16.223402 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223319 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.223299461 +0000 UTC m=+41.132600151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : configmap references non-existent config key: service-ca.crt
Apr 22 21:09:16.223402 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223230 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8576cd8d8-tvm8d: secret "image-registry-tls" not found
Apr 22 21:09:16.223402 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223335 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:16.223402 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223359 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls podName:aae8c028-d54c-4e48-8065-01b3b3bbc0b0 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.223349666 +0000 UTC m=+41.132650356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls") pod "image-registry-8576cd8d8-tvm8d" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0") : secret "image-registry-tls" not found
Apr 22 21:09:16.223402 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223374 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert podName:dd46fc2d-0bfc-4e1b-b429-0d14ff029faf nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.223366216 +0000 UTC m=+41.132666915 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert") pod "ingress-canary-xbm9h" (UID: "dd46fc2d-0bfc-4e1b-b429-0d14ff029faf") : secret "canary-serving-cert" not found
Apr 22 21:09:16.223402 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223405 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:16.223773 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223439 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls podName:3812e9ff-4663-427d-8f9a-a20d66323320 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.22342352 +0000 UTC m=+41.132724216 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls") pod "dns-default-264s7" (UID: "3812e9ff-4663-427d-8f9a-a20d66323320") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:16.223773 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223444 2546 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 21:09:16.223773 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223477 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert podName:fb72504e-c4a7-4b19-a04d-f51d0fd91812 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.223467223 +0000 UTC m=+41.132767918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lt772" (UID: "fb72504e-c4a7-4b19-a04d-f51d0fd91812") : secret "networking-console-plugin-cert" not found
Apr 22 21:09:16.223773 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223498 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 21:09:16.223773 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:16.223540 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls podName:318e56a6-6fae-4d59-84d7-4a832ad3b200 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.223528678 +0000 UTC m=+41.132829372 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l798c" (UID: "318e56a6-6fae-4d59-84d7-4a832ad3b200") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:09:16.997502 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:16.997400 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c"] Apr 22 21:09:17.023388 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.022470 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c"] Apr 22 21:09:17.023388 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.022624 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.025626 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.025229 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 21:09:17.025626 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.025239 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 21:09:17.025626 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.025289 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 21:09:17.025914 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.025838 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 21:09:17.030163 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.029899 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl2jl\" (UniqueName: \"kubernetes.io/projected/51ebc614-b757-448c-9f12-7a0eb959556b-kube-api-access-sl2jl\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.030163 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.029943 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/51ebc614-b757-448c-9f12-7a0eb959556b-klusterlet-config\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.030163 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.029988 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51ebc614-b757-448c-9f12-7a0eb959556b-tmp\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.131710 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.131108 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sl2jl\" (UniqueName: \"kubernetes.io/projected/51ebc614-b757-448c-9f12-7a0eb959556b-kube-api-access-sl2jl\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.131710 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.131168 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/51ebc614-b757-448c-9f12-7a0eb959556b-klusterlet-config\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.131710 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.131215 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51ebc614-b757-448c-9f12-7a0eb959556b-tmp\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.132837 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.132776 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51ebc614-b757-448c-9f12-7a0eb959556b-tmp\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.137290 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.137178 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/51ebc614-b757-448c-9f12-7a0eb959556b-klusterlet-config\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.140801 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.140772 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl2jl\" (UniqueName: \"kubernetes.io/projected/51ebc614-b757-448c-9f12-7a0eb959556b-kube-api-access-sl2jl\") pod \"klusterlet-addon-workmgr-55f7fdf59c-qbp6c\" (UID: \"51ebc614-b757-448c-9f12-7a0eb959556b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:17.339124 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:17.339098 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:20.257497 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257453 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257556 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257610 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257627 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257665 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257708 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257722 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls podName:318e56a6-6fae-4d59-84d7-4a832ad3b200 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:28.257700756 +0000 UTC m=+49.167001450 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l798c" (UID: "318e56a6-6fae-4d59-84d7-4a832ad3b200") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257728 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257739 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257762 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257782 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:28.257767534 +0000 UTC m=+49.167068243 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : secret "router-metrics-certs-default" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257806 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:28.257794277 +0000 UTC m=+49.167094972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : configmap references non-existent config key: service-ca.crt Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257822 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257838 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257851 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8576cd8d8-tvm8d: secret "image-registry-tls" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257851 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls podName:3812e9ff-4663-427d-8f9a-a20d66323320 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:09:28.257820242 +0000 UTC m=+49.167120957 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls") pod "dns-default-264s7" (UID: "3812e9ff-4663-427d-8f9a-a20d66323320") : secret "dns-default-metrics-tls" not found Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257886 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:20.258285 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257914 2546 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 21:09:20.258916 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:20.257918 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" Apr 22 21:09:20.258916 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257947 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert podName:fb72504e-c4a7-4b19-a04d-f51d0fd91812 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:28.257935166 +0000 UTC m=+49.167235868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lt772" (UID: "fb72504e-c4a7-4b19-a04d-f51d0fd91812") : secret "networking-console-plugin-cert" not found Apr 22 21:09:20.258916 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257970 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert podName:dd46fc2d-0bfc-4e1b-b429-0d14ff029faf nodeName:}" failed. No retries permitted until 2026-04-22 21:09:28.257958392 +0000 UTC m=+49.167259082 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert") pod "ingress-canary-xbm9h" (UID: "dd46fc2d-0bfc-4e1b-b429-0d14ff029faf") : secret "canary-serving-cert" not found Apr 22 21:09:20.258916 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257983 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls podName:aae8c028-d54c-4e48-8065-01b3b3bbc0b0 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:28.257977551 +0000 UTC m=+49.167278240 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls") pod "image-registry-8576cd8d8-tvm8d" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0") : secret "image-registry-tls" not found Apr 22 21:09:20.258916 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.257987 2546 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:09:20.258916 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:20.258023 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls podName:dcde08fd-6ed0-4845-a8b3-39ad205d8738 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:28.258004757 +0000 UTC m=+49.167305447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-z95n7" (UID: "dcde08fd-6ed0-4845-a8b3-39ad205d8738") : secret "samples-operator-tls" not found Apr 22 21:09:22.993476 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:22.993450 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c"] Apr 22 21:09:22.998042 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:22.998019 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51ebc614_b757_448c_9f12_7a0eb959556b.slice/crio-7ab78b71c291e36f212c8ac7d718fea25c187056e942038e5dbbba11357eaa40 WatchSource:0}: Error finding container 7ab78b71c291e36f212c8ac7d718fea25c187056e942038e5dbbba11357eaa40: Status 404 returned error can't find the container with id 7ab78b71c291e36f212c8ac7d718fea25c187056e942038e5dbbba11357eaa40 Apr 22 21:09:23.873673 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.872930 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" event={"ID":"0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63","Type":"ContainerStarted","Data":"f29a2a816b04dab63b81f2db8f799a20fd41183931ee23864f247855d232a172"} Apr 22 21:09:23.877893 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.876578 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" event={"ID":"2d34322a-f3f9-4e65-8714-cec36115f27e","Type":"ContainerStarted","Data":"9bb1f0ea925dfe635fd77bcc648268481debfa0fe4d2dca7723798780c561a99"} Apr 22 21:09:23.879356 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.879178 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/0.log" Apr 22 21:09:23.879356 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.879217 2546 generic.go:358] "Generic (PLEG): container finished" podID="7e715539-0e5b-418b-a686-dfcf837def03" containerID="d33d662c4b6e6aac2c4c532dc65a30a6fbfb77b767b33718132cb0d947d94eb6" exitCode=255 Apr 22 21:09:23.879356 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.879292 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" 
event={"ID":"7e715539-0e5b-418b-a686-dfcf837def03","Type":"ContainerDied","Data":"d33d662c4b6e6aac2c4c532dc65a30a6fbfb77b767b33718132cb0d947d94eb6"} Apr 22 21:09:23.879593 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.879498 2546 scope.go:117] "RemoveContainer" containerID="d33d662c4b6e6aac2c4c532dc65a30a6fbfb77b767b33718132cb0d947d94eb6" Apr 22 21:09:23.885739 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.885716 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" event={"ID":"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347","Type":"ContainerStarted","Data":"8f4a31cf4d0f875a7b12dc17f937a8c18c6b4f0fbf2acfee4e65ae1ce0dd2d0a"} Apr 22 21:09:23.888103 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.888080 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44nf9" event={"ID":"dd3b0638-5a78-4ca8-8274-4653bf8b98d0","Type":"ContainerStarted","Data":"2f6660720f045e31a7cfd859faa5ad4973bc96d50149a07acc13f9cdb8c1049c"} Apr 22 21:09:23.889841 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.889818 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd" event={"ID":"84ec7f24-1a5d-4fbc-98f4-38f3250c761d","Type":"ContainerStarted","Data":"6da298d99393bc5344a7545d8f53d913afbccd83cb137d059f2a959b60837d01"} Apr 22 21:09:23.891577 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.891555 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hm4qp" event={"ID":"dc5617e8-2aaf-4671-9fab-db6988f9b8b4","Type":"ContainerStarted","Data":"0447ab963458163ece5c3b70e40487eecbfacf9359ea9484840843d570544ac0"} Apr 22 21:09:23.891970 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.891951 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hm4qp" Apr 22 21:09:23.893492 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.893468 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw" event={"ID":"35136bae-7913-4fe1-9aa1-51925e06b1c6","Type":"ContainerStarted","Data":"76934f1d60559b53073da04e8ddef7311caae47fff96b29c7b2b9e38798208e4"} Apr 22 21:09:23.897766 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.896828 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-flkgv" event={"ID":"e49b46d0-3a94-438f-bdc2-b93ad7f84cde","Type":"ContainerStarted","Data":"4047b28f22fee817dc1cf6999051f5ebf993f07ee13dca5a920bb6cdca670da2"} Apr 22 21:09:23.901458 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.901412 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" event={"ID":"51ebc614-b757-448c-9f12-7a0eb959556b","Type":"ContainerStarted","Data":"7ab78b71c291e36f212c8ac7d718fea25c187056e942038e5dbbba11357eaa40"} Apr 22 21:09:23.917928 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.916504 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v5b9n" podStartSLOduration=13.6113452 podStartE2EDuration="44.916487643s" podCreationTimestamp="2026-04-22 21:08:39 +0000 UTC" firstStartedPulling="2026-04-22 21:08:42.338758184 +0000 UTC m=+3.248058885" lastFinishedPulling="2026-04-22 21:09:13.643900636 +0000 UTC 
m=+34.553201328" observedRunningTime="2026-04-22 21:09:23.901119893 +0000 UTC m=+44.810420631" watchObservedRunningTime="2026-04-22 21:09:23.916487643 +0000 UTC m=+44.825788357" Apr 22 21:09:23.917928 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.916766 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" podStartSLOduration=24.619989944 podStartE2EDuration="33.916760352s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.611954465 +0000 UTC m=+34.521255156" lastFinishedPulling="2026-04-22 21:09:22.908724874 +0000 UTC m=+43.818025564" observedRunningTime="2026-04-22 21:09:23.914874162 +0000 UTC m=+44.824174875" watchObservedRunningTime="2026-04-22 21:09:23.916760352 +0000 UTC m=+44.826061065" Apr 22 21:09:23.929593 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.929119 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-98ckw" podStartSLOduration=24.934038149 podStartE2EDuration="33.929100542s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.623069279 +0000 UTC m=+34.532369975" lastFinishedPulling="2026-04-22 21:09:22.618131667 +0000 UTC m=+43.527432368" observedRunningTime="2026-04-22 21:09:23.927617212 +0000 UTC m=+44.836917925" watchObservedRunningTime="2026-04-22 21:09:23.929100542 +0000 UTC m=+44.838401256" Apr 22 21:09:23.945004 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.944951 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-flkgv" podStartSLOduration=33.750619441 podStartE2EDuration="41.944930849s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:09:14.716107775 +0000 UTC m=+35.625408470" lastFinishedPulling="2026-04-22 21:09:22.910419186 +0000 UTC m=+43.819719878" observedRunningTime="2026-04-22 21:09:23.943575959 +0000 UTC m=+44.852876671" watchObservedRunningTime="2026-04-22 21:09:23.944930849 +0000 UTC m=+44.854231564" Apr 22 21:09:23.980071 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.979474 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-44nf9" podStartSLOduration=24.69668273 podStartE2EDuration="33.979457102s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.628021571 +0000 UTC m=+34.537322262" lastFinishedPulling="2026-04-22 21:09:22.910795929 +0000 UTC m=+43.820096634" observedRunningTime="2026-04-22 21:09:23.977972034 +0000 UTC m=+44.887272748" watchObservedRunningTime="2026-04-22 21:09:23.979457102 +0000 UTC m=+44.888757814" Apr 22 21:09:23.999394 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:23.997751 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hqxdd" podStartSLOduration=24.701009973 podStartE2EDuration="33.997733052s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.611908771 +0000 UTC m=+34.521209461" lastFinishedPulling="2026-04-22 21:09:22.908631844 +0000 UTC m=+43.817932540" observedRunningTime="2026-04-22 21:09:23.996677419 +0000 UTC m=+44.905978124" watchObservedRunningTime="2026-04-22 21:09:23.997733052 +0000 UTC m=+44.907033763" Apr 22 21:09:24.013334 ip-10-0-138-20 kubenswrapper[2546]: I0422 
21:09:24.012014 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hm4qp" podStartSLOduration=34.854470495 podStartE2EDuration="44.011997059s" podCreationTimestamp="2026-04-22 21:08:40 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.767836368 +0000 UTC m=+34.677137057" lastFinishedPulling="2026-04-22 21:09:22.925362917 +0000 UTC m=+43.834663621" observedRunningTime="2026-04-22 21:09:24.011768571 +0000 UTC m=+44.921069284" watchObservedRunningTime="2026-04-22 21:09:24.011997059 +0000 UTC m=+44.921297769" Apr 22 21:09:24.906762 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:24.906737 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:09:24.907212 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:24.907188 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/0.log" Apr 22 21:09:24.907339 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:24.907235 2546 generic.go:358] "Generic (PLEG): container finished" podID="7e715539-0e5b-418b-a686-dfcf837def03" containerID="c36b25e802d0dac0dacfc25d511ea95bfe8634e84017faddfb39e8832354759e" exitCode=255 Apr 22 21:09:24.907409 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:24.907349 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" event={"ID":"7e715539-0e5b-418b-a686-dfcf837def03","Type":"ContainerDied","Data":"c36b25e802d0dac0dacfc25d511ea95bfe8634e84017faddfb39e8832354759e"} Apr 22 21:09:24.907409 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:24.907399 2546 scope.go:117] "RemoveContainer" containerID="d33d662c4b6e6aac2c4c532dc65a30a6fbfb77b767b33718132cb0d947d94eb6" Apr 22 21:09:24.907607 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:24.907590 2546 scope.go:117] "RemoveContainer" containerID="c36b25e802d0dac0dacfc25d511ea95bfe8634e84017faddfb39e8832354759e" Apr 22 21:09:24.907844 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:24.907823 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2rdf2_openshift-console-operator(7e715539-0e5b-418b-a686-dfcf837def03)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" podUID="7e715539-0e5b-418b-a686-dfcf837def03" Apr 22 21:09:24.928423 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:24.928371 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" podStartSLOduration=25.650451606 podStartE2EDuration="34.928352279s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.631688504 +0000 UTC m=+34.540989195" lastFinishedPulling="2026-04-22 21:09:22.909589161 +0000 UTC m=+43.818889868" observedRunningTime="2026-04-22 21:09:24.028534665 +0000 UTC m=+44.937835377" watchObservedRunningTime="2026-04-22 21:09:24.928352279 +0000 UTC m=+45.837652992" Apr 22 21:09:25.911051 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:25.911025 2546 scope.go:117] "RemoveContainer" containerID="c36b25e802d0dac0dacfc25d511ea95bfe8634e84017faddfb39e8832354759e" Apr 22 
21:09:25.911481 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:25.911221 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2rdf2_openshift-console-operator(7e715539-0e5b-418b-a686-dfcf837def03)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" podUID="7e715539-0e5b-418b-a686-dfcf837def03" Apr 22 21:09:26.641027 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:26.640998 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v6vgf_fbb54413-9542-4220-90ec-df660b4891fa/dns-node-resolver/0.log" Apr 22 21:09:26.915529 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:26.915458 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:09:27.641290 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.641261 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zbnft_ee933643-0139-4bfd-9d0d-903d3404df30/node-ca/0.log" Apr 22 21:09:27.854401 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.854371 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gw4hr"] Apr 22 21:09:27.870365 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.870330 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gw4hr"] Apr 22 21:09:27.870525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.870463 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:27.872768 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.872748 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 21:09:27.873532 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.873517 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 21:09:27.873669 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.873516 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 21:09:27.873669 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.873519 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-s5bgw\"" Apr 22 21:09:27.873867 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.873852 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 21:09:27.946592 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.946518 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-signing-cabundle\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:27.946940 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.946712 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-signing-key\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:27.946940 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:27.946738 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9zz8\" (UniqueName: \"kubernetes.io/projected/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-kube-api-access-g9zz8\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.047404 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.047374 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-signing-key\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.047520 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.047408 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9zz8\" (UniqueName: \"kubernetes.io/projected/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-kube-api-access-g9zz8\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.047520 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.047479 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-signing-cabundle\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.048073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.048052 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-signing-cabundle\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.050073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.050056 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-signing-key\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.054708 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.054684 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9zz8\" (UniqueName: \"kubernetes.io/projected/4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd-kube-api-access-g9zz8\") pod \"service-ca-865cb79987-gw4hr\" (UID: \"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd\") " pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.180077 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.180050 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gw4hr" Apr 22 21:09:28.297340 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.297306 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gw4hr"] Apr 22 21:09:28.350259 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350232 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:28.350348 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350314 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:28.350426 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350371 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:28.350426 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350397 2546 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:09:28.350426 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350404 2546 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350409 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350457 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls podName:318e56a6-6fae-4d59-84d7-4a832ad3b200 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350436983 +0000 UTC m=+65.259737686 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-l798c" (UID: "318e56a6-6fae-4d59-84d7-4a832ad3b200") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350480 2546 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350482 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350472212 +0000 UTC m=+65.259772901 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : secret "router-metrics-certs-default" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350494 2546 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8576cd8d8-tvm8d: secret "image-registry-tls" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350508 2546 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350544 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls podName:aae8c028-d54c-4e48-8065-01b3b3bbc0b0 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350527579 +0000 UTC m=+65.259828274 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls") pod "image-registry-8576cd8d8-tvm8d" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0") : secret "image-registry-tls" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350538 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350561 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls podName:3812e9ff-4663-427d-8f9a-a20d66323320 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350550007 +0000 UTC m=+65.259850719 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls") pod "dns-default-264s7" (UID: "3812e9ff-4663-427d-8f9a-a20d66323320") : secret "dns-default-metrics-tls" not found Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350594 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:28.350624 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350599 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle podName:0bf09f77-1234-462f-bd11-01faf61546b9 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350588938 +0000 UTC m=+65.259889642 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle") pod "router-default-569c57978d-cwfqs" (UID: "0bf09f77-1234-462f-bd11-01faf61546b9") : configmap references non-existent config key: service-ca.crt Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350636 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350668 2546 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.350685 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350699 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert podName:dd46fc2d-0bfc-4e1b-b429-0d14ff029faf nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350688598 +0000 UTC m=+65.259989302 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert") pod "ingress-canary-xbm9h" (UID: "dd46fc2d-0bfc-4e1b-b429-0d14ff029faf") : secret "canary-serving-cert" not found Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350743 2546 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350772 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls podName:dcde08fd-6ed0-4845-a8b3-39ad205d8738 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350763662 +0000 UTC m=+65.260064353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-z95n7" (UID: "dcde08fd-6ed0-4845-a8b3-39ad205d8738") : secret "samples-operator-tls" not found Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350787 2546 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 21:09:28.351037 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:28.350833 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert podName:fb72504e-c4a7-4b19-a04d-f51d0fd91812 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:44.350818081 +0000 UTC m=+65.260118770 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lt772" (UID: "fb72504e-c4a7-4b19-a04d-f51d0fd91812") : secret "networking-console-plugin-cert" not found Apr 22 21:09:28.924536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.924501 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gw4hr" event={"ID":"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd","Type":"ContainerStarted","Data":"bb7e6157012995692588be20145231cd7640b2454a6e9cd54b16806d874b1b1e"} Apr 22 21:09:28.924926 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.924543 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gw4hr" event={"ID":"4cf7c49e-b8fc-4377-ae66-3e85ee8e96cd","Type":"ContainerStarted","Data":"f1a0fc445a421efba7ec3d6714bc875241138679efb4d2848c210fbb8c302e1c"} Apr 22 21:09:28.925902 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.925881 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" event={"ID":"51ebc614-b757-448c-9f12-7a0eb959556b","Type":"ContainerStarted","Data":"d4c67add9a78532203dbd9534bbfaec273706bf97a2ae2b76a2a87b977ed45b1"} Apr 22 21:09:28.926127 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.926112 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:28.927940 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.927920 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" Apr 22 21:09:28.939688 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.939625 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-gw4hr" podStartSLOduration=1.939610048 podStartE2EDuration="1.939610048s" podCreationTimestamp="2026-04-22 21:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:28.938425552 +0000 UTC m=+49.847726265" watchObservedRunningTime="2026-04-22 21:09:28.939610048 +0000 UTC m=+49.848910759" Apr 22 21:09:28.953115 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:28.953077 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-55f7fdf59c-qbp6c" podStartSLOduration=7.931514384 podStartE2EDuration="12.953068336s" podCreationTimestamp="2026-04-22 21:09:16 +0000 UTC" firstStartedPulling="2026-04-22 21:09:22.999572724 +0000 UTC m=+43.908873420" lastFinishedPulling="2026-04-22 21:09:28.02112667 +0000 UTC m=+48.930427372" observedRunningTime="2026-04-22 21:09:28.952673107 +0000 UTC m=+49.861973819" watchObservedRunningTime="2026-04-22 21:09:28.953068336 +0000 UTC m=+49.862369062" Apr 22 21:09:32.805433 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:32.805394 2546 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:32.805864 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:32.805450 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:32.805864 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:32.805853 2546 scope.go:117] "RemoveContainer" containerID="c36b25e802d0dac0dacfc25d511ea95bfe8634e84017faddfb39e8832354759e" Apr 22 21:09:32.806060 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:09:32.806039 2546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2rdf2_openshift-console-operator(7e715539-0e5b-418b-a686-dfcf837def03)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" podUID="7e715539-0e5b-418b-a686-dfcf837def03" Apr 22 21:09:37.822153 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:37.822128 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6flqf" Apr 22 21:09:44.400372 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400326 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400405 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400435 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7" Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400464 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400493 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400675 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h" Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400731 
Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400731 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772"
Apr 22 21:09:44.400871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.400763 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"
Apr 22 21:09:44.401440 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.401371 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf09f77-1234-462f-bd11-01faf61546b9-service-ca-bundle\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:44.403357 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.403302 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd46fc2d-0bfc-4e1b-b429-0d14ff029faf-cert\") pod \"ingress-canary-xbm9h\" (UID: \"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf\") " pod="openshift-ingress-canary/ingress-canary-xbm9h"
Apr 22 21:09:44.403807 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.403783 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3812e9ff-4663-427d-8f9a-a20d66323320-metrics-tls\") pod \"dns-default-264s7\" (UID: \"3812e9ff-4663-427d-8f9a-a20d66323320\") " pod="openshift-dns/dns-default-264s7"
Apr 22 21:09:44.403936 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.403912 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/318e56a6-6fae-4d59-84d7-4a832ad3b200-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-l798c\" (UID: \"318e56a6-6fae-4d59-84d7-4a832ad3b200\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"
Apr 22 21:09:44.403981 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.403941 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcde08fd-6ed0-4845-a8b3-39ad205d8738-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-z95n7\" (UID: \"dcde08fd-6ed0-4845-a8b3-39ad205d8738\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"
Apr 22 21:09:44.403981 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.403945 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"image-registry-8576cd8d8-tvm8d\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:44.404068 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.404052 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fb72504e-c4a7-4b19-a04d-f51d0fd91812-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lt772\" (UID: \"fb72504e-c4a7-4b19-a04d-f51d0fd91812\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772"
Apr 22 21:09:44.404369 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.404349 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf09f77-1234-462f-bd11-01faf61546b9-metrics-certs\") pod \"router-default-569c57978d-cwfqs\" (UID: \"0bf09f77-1234-462f-bd11-01faf61546b9\") " pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:44.540570 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.540541 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jsc9g\""
Apr 22 21:09:44.549275 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.549254 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:44.570043 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.570018 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-v2j4k\""
Apr 22 21:09:44.578723 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.578675 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"
Apr 22 21:09:44.600084 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.600035 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bxtgl\""
Apr 22 21:09:44.608086 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.608054 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"
Apr 22 21:09:44.615268 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.615243 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ll98\""
Apr 22 21:09:44.624030 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.624002 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-264s7"
Apr 22 21:09:44.631103 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.630725 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gtqgh\""
Apr 22 21:09:44.638234 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.638118 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-c86tt\""
Apr 22 21:09:44.638470 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.638350 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772"
Apr 22 21:09:44.652799 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.652033 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xbm9h"
Apr 22 21:09:44.656120 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.655805 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-w4bt5\""
Apr 22 21:09:44.662637 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.662588 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:44.703043 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.702567 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8576cd8d8-tvm8d"]
Apr 22 21:09:44.716748 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:44.716660 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae8c028_d54c_4e48_8065_01b3b3bbc0b0.slice/crio-1f6110270096e468bb9538bd15875595f01152841aca98124eea69d56d9b694c WatchSource:0}: Error finding container 1f6110270096e468bb9538bd15875595f01152841aca98124eea69d56d9b694c: Status 404 returned error can't find the container with id 1f6110270096e468bb9538bd15875595f01152841aca98124eea69d56d9b694c
Apr 22 21:09:44.734255 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.734221 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7"]
Apr 22 21:09:44.838667 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.838608 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c"]
Apr 22 21:09:44.850854 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.850609 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-264s7"]
Apr 22 21:09:44.852185 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:44.852136 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod318e56a6_6fae_4d59_84d7_4a832ad3b200.slice/crio-166c62b310f9d491cfc83b8542cb0c16a8c6043505c66d1be743fea5e4373df2 WatchSource:0}: Error finding container 166c62b310f9d491cfc83b8542cb0c16a8c6043505c66d1be743fea5e4373df2: Status 404 returned error can't find the container with id 166c62b310f9d491cfc83b8542cb0c16a8c6043505c66d1be743fea5e4373df2
Apr 22 21:09:44.859549 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:44.859515 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3812e9ff_4663_427d_8f9a_a20d66323320.slice/crio-88999d7e9d14d4f2fe56d3bb3fc85e170a0619a168c453938f4d750f4668fac0 WatchSource:0}: Error finding container 88999d7e9d14d4f2fe56d3bb3fc85e170a0619a168c453938f4d750f4668fac0: Status 404 returned error can't find the container with id 88999d7e9d14d4f2fe56d3bb3fc85e170a0619a168c453938f4d750f4668fac0
Apr 22 21:09:44.882008 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.881980 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lt772"]
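The manager.go:1169 warnings in this stretch are a benign startup race rather than a failure: the cgroup watch sees the new crio-<id> slice before the runtime has finished registering the container, so the lookup by ID returns a 404, and the same container IDs appear moments later as ContainerStarted events in the PLEG entries below. A handler that tolerates such a race simply skips not-found lookups and lets a later event catch up; a sketch, with the lookup function and error as stand-ins rather than the kubelet's real internals:

    // watchrace.go: ignore container lookups that lose the creation race.
    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for the runtime's 404 on an unknown container ID.
    var errNotFound = errors.New("can't find the container")

    // lookupContainer is a placeholder for querying the runtime by ID.
    func lookupContainer(id string) error { return errNotFound }

    func processWatchEvent(id string) {
        if err := lookupContainer(id); errors.Is(err, errNotFound) {
            // Benign: the cgroup exists but the container isn't registered
            // yet. A later event or relist will pick it up.
            fmt.Printf("skipping watch event for %s: not registered yet\n", id)
            return
        }
        fmt.Printf("processed watch event for %s\n", id)
    }

    func main() { processWatchEvent("example-container-id") }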
Apr 22 21:09:44.885273 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:44.885242 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb72504e_c4a7_4b19_a04d_f51d0fd91812.slice/crio-2fd5342565154e6a2b6b70f2d7aed919b7aaa616cf98f56c191552dd969fc11b WatchSource:0}: Error finding container 2fd5342565154e6a2b6b70f2d7aed919b7aaa616cf98f56c191552dd969fc11b: Status 404 returned error can't find the container with id 2fd5342565154e6a2b6b70f2d7aed919b7aaa616cf98f56c191552dd969fc11b
Apr 22 21:09:44.966028 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.965993 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" event={"ID":"318e56a6-6fae-4d59-84d7-4a832ad3b200","Type":"ContainerStarted","Data":"166c62b310f9d491cfc83b8542cb0c16a8c6043505c66d1be743fea5e4373df2"}
Apr 22 21:09:44.967278 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.967253 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" event={"ID":"aae8c028-d54c-4e48-8065-01b3b3bbc0b0","Type":"ContainerStarted","Data":"8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e"}
Apr 22 21:09:44.967406 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.967283 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" event={"ID":"aae8c028-d54c-4e48-8065-01b3b3bbc0b0","Type":"ContainerStarted","Data":"1f6110270096e468bb9538bd15875595f01152841aca98124eea69d56d9b694c"}
Apr 22 21:09:44.967406 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.967373 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d"
Apr 22 21:09:44.968331 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.968300 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-264s7" event={"ID":"3812e9ff-4663-427d-8f9a-a20d66323320","Type":"ContainerStarted","Data":"88999d7e9d14d4f2fe56d3bb3fc85e170a0619a168c453938f4d750f4668fac0"}
Apr 22 21:09:44.969339 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.969319 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" event={"ID":"fb72504e-c4a7-4b19-a04d-f51d0fd91812","Type":"ContainerStarted","Data":"2fd5342565154e6a2b6b70f2d7aed919b7aaa616cf98f56c191552dd969fc11b"}
Apr 22 21:09:44.970214 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.970193 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" event={"ID":"dcde08fd-6ed0-4845-a8b3-39ad205d8738","Type":"ContainerStarted","Data":"f47edd949dfbcad1ddfcf6f661d828bb7fbb9c7bbecef4afa36927800ecfa5f9"}
Apr 22 21:09:44.984487 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:44.984451 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" podStartSLOduration=64.984440112 podStartE2EDuration="1m4.984440112s" podCreationTimestamp="2026-04-22 21:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:44.983752999 +0000 UTC m=+65.893053711" watchObservedRunningTime="2026-04-22 21:09:44.984440112 +0000 UTC m=+65.893740823"
Apr 22 21:09:45.107488 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.107451 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xbm9h"]
Apr 22 21:09:45.110349 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:45.110320 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd46fc2d_0bfc_4e1b_b429_0d14ff029faf.slice/crio-d22fddafc2daec6b6345d1632292cd5ae33cf8311eff35bbddee945ef8408c24 WatchSource:0}: Error finding container d22fddafc2daec6b6345d1632292cd5ae33cf8311eff35bbddee945ef8408c24: Status 404 returned error can't find the container with id d22fddafc2daec6b6345d1632292cd5ae33cf8311eff35bbddee945ef8408c24
Apr 22 21:09:45.114409 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.114386 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-569c57978d-cwfqs"]
Apr 22 21:09:45.117321 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:45.117295 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf09f77_1234_462f_bd11_01faf61546b9.slice/crio-681b1272fd0352f25aa053a5953e8d57e2da89d9fe1e1067222eb5c6a116173b WatchSource:0}: Error finding container 681b1272fd0352f25aa053a5953e8d57e2da89d9fe1e1067222eb5c6a116173b: Status 404 returned error can't find the container with id 681b1272fd0352f25aa053a5953e8d57e2da89d9fe1e1067222eb5c6a116173b
Apr 22 21:09:45.410597 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.410559 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:09:45.413435 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.413407 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046a767f-6fe0-4c57-ba82-6a2ce75f7864-metrics-certs\") pod \"network-metrics-daemon-4476p\" (UID: \"046a767f-6fe0-4c57-ba82-6a2ce75f7864\") " pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:09:45.429979 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.429902 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ccfht\""
Apr 22 21:09:45.436940 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.436916 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4476p"
Apr 22 21:09:45.603586 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.603081 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4476p"]
Apr 22 21:09:45.614501 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:45.614462 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046a767f_6fe0_4c57_ba82_6a2ce75f7864.slice/crio-643c1b6f3c542b3ba6fe1acca27ac159971651d99dc2abd849b66e4a6a4d127b WatchSource:0}: Error finding container 643c1b6f3c542b3ba6fe1acca27ac159971651d99dc2abd849b66e4a6a4d127b: Status 404 returned error can't find the container with id 643c1b6f3c542b3ba6fe1acca27ac159971651d99dc2abd849b66e4a6a4d127b
Apr 22 21:09:45.978861 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.978815 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4476p" event={"ID":"046a767f-6fe0-4c57-ba82-6a2ce75f7864","Type":"ContainerStarted","Data":"643c1b6f3c542b3ba6fe1acca27ac159971651d99dc2abd849b66e4a6a4d127b"}
Apr 22 21:09:45.982172 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.982138 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xbm9h" event={"ID":"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf","Type":"ContainerStarted","Data":"d22fddafc2daec6b6345d1632292cd5ae33cf8311eff35bbddee945ef8408c24"}
Apr 22 21:09:45.984352 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.984322 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-569c57978d-cwfqs" event={"ID":"0bf09f77-1234-462f-bd11-01faf61546b9","Type":"ContainerStarted","Data":"492d158e76dca985a7c4ce9385453203a90b51f895db86058774e92099478d5d"}
Apr 22 21:09:45.984498 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:45.984362 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-569c57978d-cwfqs" event={"ID":"0bf09f77-1234-462f-bd11-01faf61546b9","Type":"ContainerStarted","Data":"681b1272fd0352f25aa053a5953e8d57e2da89d9fe1e1067222eb5c6a116173b"}
Apr 22 21:09:46.004258 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:46.003272 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-569c57978d-cwfqs" podStartSLOduration=56.003256999 podStartE2EDuration="56.003256999s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:46.002468119 +0000 UTC m=+66.911768829" watchObservedRunningTime="2026-04-22 21:09:46.003256999 +0000 UTC m=+66.912557701"
Apr 22 21:09:46.663533 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:46.663503 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:46.666191 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:46.666168 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-569c57978d-cwfqs"
Apr 22 21:09:46.676232 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:46.676206 2546 scope.go:117] "RemoveContainer" containerID="c36b25e802d0dac0dacfc25d511ea95bfe8634e84017faddfb39e8832354759e"
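The router entries just above show a startup probe flipping from unhealthy to started within milliseconds, after which its readiness results (next entries) begin to count; a startup probe's job is precisely to hold off liveness and readiness checking until a slow-starting container is up. Schematically, a probe pair that produces this sequence looks like the following, with path, port, and thresholds as placeholders since the router's actual probe spec is not part of this log:

    // probes.go: schematic startup/readiness probe pair (placeholder values).
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        startup := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt(1936)},
            },
            PeriodSeconds:    1,   // poll quickly while the container comes up...
            FailureThreshold: 120, // ...allowing up to ~2 minutes before giving up
        }
        readiness := corev1.Probe{
            // Not evaluated until the startup probe has succeeded once.
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{Path: "/healthz/ready", Port: intstr.FromInt(1936)},
            },
            PeriodSeconds: 10,
        }
        fmt.Println(startup.HTTPGet.Path, readiness.HTTPGet.Path)
    }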
status="not ready" pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:46.988403 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:46.988380 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-569c57978d-cwfqs" Apr 22 21:09:49.999438 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:49.999397 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" event={"ID":"318e56a6-6fae-4d59-84d7-4a832ad3b200","Type":"ContainerStarted","Data":"d31be51dbb6aa028ae200339fd06a0f6826d75ec296604a35d61e961c05d3758"} Apr 22 21:09:50.002049 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.001998 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:09:50.002195 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.002099 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" event={"ID":"7e715539-0e5b-418b-a686-dfcf837def03","Type":"ContainerStarted","Data":"3169cb34d58d957d3784d83562541726181b98477be791d01f25bafe12af6b83"} Apr 22 21:09:50.002626 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.002575 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:50.004862 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.004699 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-264s7" event={"ID":"3812e9ff-4663-427d-8f9a-a20d66323320","Type":"ContainerStarted","Data":"cb97d90d7e6816281e161cd4af65de46a7f5f185e4fc317ac11f80bec942006e"} Apr 22 21:09:50.009679 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.007482 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" event={"ID":"fb72504e-c4a7-4b19-a04d-f51d0fd91812","Type":"ContainerStarted","Data":"2b3b03c351844384ace08955a50ada164f2da5a894a420a2aa783e7ff5f81813"} Apr 22 21:09:50.013143 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.009787 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" event={"ID":"dcde08fd-6ed0-4845-a8b3-39ad205d8738","Type":"ContainerStarted","Data":"5f0d2ef4030d3a138296dfa93994e3aaa1aa64694442f3b38b43968a061e98e7"} Apr 22 21:09:50.013143 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.012559 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xbm9h" event={"ID":"dd46fc2d-0bfc-4e1b-b429-0d14ff029faf","Type":"ContainerStarted","Data":"984082a5aa6e3b4289e7056f778680e816b2b70ddb4926c57b627096fd4cdeb9"} Apr 22 21:09:50.013414 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.013371 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" Apr 22 21:09:50.022719 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.022411 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-l798c" podStartSLOduration=55.258863184 podStartE2EDuration="1m0.022395232s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:44.856612117 +0000 UTC m=+65.765912815" 
lastFinishedPulling="2026-04-22 21:09:49.620144173 +0000 UTC m=+70.529444863" observedRunningTime="2026-04-22 21:09:50.021061459 +0000 UTC m=+70.930362170" watchObservedRunningTime="2026-04-22 21:09:50.022395232 +0000 UTC m=+70.931695943" Apr 22 21:09:50.040993 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.040924 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xbm9h" podStartSLOduration=33.532889654 podStartE2EDuration="38.040906197s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:09:45.112134906 +0000 UTC m=+66.021435596" lastFinishedPulling="2026-04-22 21:09:49.620151447 +0000 UTC m=+70.529452139" observedRunningTime="2026-04-22 21:09:50.039573273 +0000 UTC m=+70.948873984" watchObservedRunningTime="2026-04-22 21:09:50.040906197 +0000 UTC m=+70.950206904" Apr 22 21:09:50.065768 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.065715 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2rdf2" podStartSLOduration=51.079005614 podStartE2EDuration="1m0.065701349s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.631440842 +0000 UTC m=+34.540741535" lastFinishedPulling="2026-04-22 21:09:22.618136578 +0000 UTC m=+43.527437270" observedRunningTime="2026-04-22 21:09:50.06520009 +0000 UTC m=+70.974500803" watchObservedRunningTime="2026-04-22 21:09:50.065701349 +0000 UTC m=+70.975002054" Apr 22 21:09:50.066476 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:50.066446 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lt772" podStartSLOduration=41.33846874 podStartE2EDuration="46.066439205s" podCreationTimestamp="2026-04-22 21:09:04 +0000 UTC" firstStartedPulling="2026-04-22 21:09:44.892213117 +0000 UTC m=+65.801513806" lastFinishedPulling="2026-04-22 21:09:49.620183568 +0000 UTC m=+70.529484271" observedRunningTime="2026-04-22 21:09:50.052487058 +0000 UTC m=+70.961787793" watchObservedRunningTime="2026-04-22 21:09:50.066439205 +0000 UTC m=+70.975739919" Apr 22 21:09:51.018589 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.018556 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4476p" event={"ID":"046a767f-6fe0-4c57-ba82-6a2ce75f7864","Type":"ContainerStarted","Data":"fbac3d3871db58074c73a68d21245403f981a7da1d28d8dd57dadd87e81b633f"} Apr 22 21:09:51.018589 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.018593 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4476p" event={"ID":"046a767f-6fe0-4c57-ba82-6a2ce75f7864","Type":"ContainerStarted","Data":"9be39bfe5795b78302c1116f85ce75af5b8030de3bbbcf1ddabec2e08c8c4678"} Apr 22 21:09:51.020303 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.020273 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-264s7" event={"ID":"3812e9ff-4663-427d-8f9a-a20d66323320","Type":"ContainerStarted","Data":"2579f12153d93139c67b87baa73b50ecf554208a8daa8449eaa60d93fe5c5535"} Apr 22 21:09:51.020434 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.020406 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-264s7" Apr 22 21:09:51.021920 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.021897 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" event={"ID":"dcde08fd-6ed0-4845-a8b3-39ad205d8738","Type":"ContainerStarted","Data":"dbc2f2dcd3e5c709eec94362439f58be49186501379503bdfc20ce90c1d66aab"} Apr 22 21:09:51.035835 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.035795 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4476p" podStartSLOduration=66.56787818 podStartE2EDuration="1m11.035783505s" podCreationTimestamp="2026-04-22 21:08:40 +0000 UTC" firstStartedPulling="2026-04-22 21:09:45.62020284 +0000 UTC m=+66.529503533" lastFinishedPulling="2026-04-22 21:09:50.088108167 +0000 UTC m=+70.997408858" observedRunningTime="2026-04-22 21:09:51.034455199 +0000 UTC m=+71.943755924" watchObservedRunningTime="2026-04-22 21:09:51.035783505 +0000 UTC m=+71.945084216" Apr 22 21:09:51.048620 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.048577 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-z95n7" podStartSLOduration=56.271849532 podStartE2EDuration="1m1.048562788s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="2026-04-22 21:09:44.843082697 +0000 UTC m=+65.752383393" lastFinishedPulling="2026-04-22 21:09:49.619795946 +0000 UTC m=+70.529096649" observedRunningTime="2026-04-22 21:09:51.047840914 +0000 UTC m=+71.957141626" watchObservedRunningTime="2026-04-22 21:09:51.048562788 +0000 UTC m=+71.957863500" Apr 22 21:09:51.062609 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:51.062567 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-264s7" podStartSLOduration=34.304102047 podStartE2EDuration="39.062552611s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:09:44.861343291 +0000 UTC m=+65.770643979" lastFinishedPulling="2026-04-22 21:09:49.619793848 +0000 UTC m=+70.529094543" observedRunningTime="2026-04-22 21:09:51.061089194 +0000 UTC m=+71.970389918" watchObservedRunningTime="2026-04-22 21:09:51.062552611 +0000 UTC m=+71.971853326" Apr 22 21:09:52.024561 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.024531 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q4bdt"] Apr 22 21:09:52.030032 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.029769 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.032828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.032807 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 21:09:52.034480 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.034457 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7tkh8\"" Apr 22 21:09:52.034785 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.034499 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 21:09:52.036574 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.036553 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q4bdt"] Apr 22 21:09:52.068242 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.068214 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hjj\" (UniqueName: \"kubernetes.io/projected/b818737b-3647-434e-b75c-fa264c7a4dd2-kube-api-access-x4hjj\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.068583 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.068555 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b818737b-3647-434e-b75c-fa264c7a4dd2-crio-socket\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.068726 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.068609 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b818737b-3647-434e-b75c-fa264c7a4dd2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.068726 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.068639 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b818737b-3647-434e-b75c-fa264c7a4dd2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.068861 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.068831 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b818737b-3647-434e-b75c-fa264c7a4dd2-data-volume\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170134 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170101 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b818737b-3647-434e-b75c-fa264c7a4dd2-data-volume\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " 
pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170305 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170148 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hjj\" (UniqueName: \"kubernetes.io/projected/b818737b-3647-434e-b75c-fa264c7a4dd2-kube-api-access-x4hjj\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170305 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170197 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b818737b-3647-434e-b75c-fa264c7a4dd2-crio-socket\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170305 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170222 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b818737b-3647-434e-b75c-fa264c7a4dd2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170305 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170240 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b818737b-3647-434e-b75c-fa264c7a4dd2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170463 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170321 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b818737b-3647-434e-b75c-fa264c7a4dd2-crio-socket\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170518 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170481 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b818737b-3647-434e-b75c-fa264c7a4dd2-data-volume\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.170793 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.170772 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b818737b-3647-434e-b75c-fa264c7a4dd2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.172796 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.172774 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b818737b-3647-434e-b75c-fa264c7a4dd2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt" Apr 22 21:09:52.178110 ip-10-0-138-20 kubenswrapper[2546]: I0422 
Apr 22 21:09:52.178110 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.178090 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hjj\" (UniqueName: \"kubernetes.io/projected/b818737b-3647-434e-b75c-fa264c7a4dd2-kube-api-access-x4hjj\") pod \"insights-runtime-extractor-q4bdt\" (UID: \"b818737b-3647-434e-b75c-fa264c7a4dd2\") " pod="openshift-insights/insights-runtime-extractor-q4bdt"
Apr 22 21:09:52.342752 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.342720 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q4bdt"
Apr 22 21:09:52.465392 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:52.465324 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q4bdt"]
Apr 22 21:09:52.468033 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:52.468001 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb818737b_3647_434e_b75c_fa264c7a4dd2.slice/crio-0adc7f92ff3377985331c06a13f457994e9ac5727896498282e87b0f99ebefe5 WatchSource:0}: Error finding container 0adc7f92ff3377985331c06a13f457994e9ac5727896498282e87b0f99ebefe5: Status 404 returned error can't find the container with id 0adc7f92ff3377985331c06a13f457994e9ac5727896498282e87b0f99ebefe5
Apr 22 21:09:53.031452 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:53.031365 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4bdt" event={"ID":"b818737b-3647-434e-b75c-fa264c7a4dd2","Type":"ContainerStarted","Data":"85b44500ea35c5b66ef12cff9def5f9d150db23b2f5579d8f7b9b6cc9f50edf6"}
Apr 22 21:09:53.031452 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:53.031408 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4bdt" event={"ID":"b818737b-3647-434e-b75c-fa264c7a4dd2","Type":"ContainerStarted","Data":"0adc7f92ff3377985331c06a13f457994e9ac5727896498282e87b0f99ebefe5"}
Apr 22 21:09:54.035385 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:54.035345 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4bdt" event={"ID":"b818737b-3647-434e-b75c-fa264c7a4dd2","Type":"ContainerStarted","Data":"824c2510eb4eced2f92f9b383ffb65fb2fc8bbf99eeb3a5c6889cc241ba5ea63"}
Apr 22 21:09:55.040712 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.040679 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q4bdt" event={"ID":"b818737b-3647-434e-b75c-fa264c7a4dd2","Type":"ContainerStarted","Data":"3d55b15e225054dd91c1fb04461132c45d63cb6000ba903a825ec78ee78dd01d"}
Apr 22 21:09:55.056487 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.056448 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q4bdt" podStartSLOduration=1.669378863 podStartE2EDuration="4.056435199s" podCreationTimestamp="2026-04-22 21:09:51 +0000 UTC" firstStartedPulling="2026-04-22 21:09:52.528326636 +0000 UTC m=+73.437627331" lastFinishedPulling="2026-04-22 21:09:54.915382976 +0000 UTC m=+75.824683667" observedRunningTime="2026-04-22 21:09:55.055500353 +0000 UTC m=+75.964801065" watchObservedRunningTime="2026-04-22 21:09:55.056435199 +0000 UTC m=+75.965735910"
Apr 22 21:09:55.287597 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.287517 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"]
Apr 22 21:09:55.290775 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.290750 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.292960 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.292933 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 21:09:55.293066 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.292938 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 21:09:55.293663 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.293626 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 21:09:55.293786 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.293675 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gszwm\""
Apr 22 21:09:55.297737 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.297713 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"]
Apr 22 21:09:55.394894 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.394863 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09ad8720-76bf-492a-b6dc-52abf1046e57-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.395050 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.394908 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/09ad8720-76bf-492a-b6dc-52abf1046e57-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.395050 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.394937 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btw8w\" (UniqueName: \"kubernetes.io/projected/09ad8720-76bf-492a-b6dc-52abf1046e57-kube-api-access-btw8w\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.395050 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.394968 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09ad8720-76bf-492a-b6dc-52abf1046e57-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.495688 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.495639 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09ad8720-76bf-492a-b6dc-52abf1046e57-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.495824 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.495707 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/09ad8720-76bf-492a-b6dc-52abf1046e57-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.495824 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.495732 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btw8w\" (UniqueName: \"kubernetes.io/projected/09ad8720-76bf-492a-b6dc-52abf1046e57-kube-api-access-btw8w\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.495824 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.495767 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09ad8720-76bf-492a-b6dc-52abf1046e57-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.496289 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.496271 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09ad8720-76bf-492a-b6dc-52abf1046e57-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.498406 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.498383 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/09ad8720-76bf-492a-b6dc-52abf1046e57-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.498456 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.498426 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09ad8720-76bf-492a-b6dc-52abf1046e57-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.506770 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.506740 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btw8w\" (UniqueName: \"kubernetes.io/projected/09ad8720-76bf-492a-b6dc-52abf1046e57-kube-api-access-btw8w\") pod \"prometheus-operator-5676c8c784-2kkrv\" (UID: \"09ad8720-76bf-492a-b6dc-52abf1046e57\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.600767 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.600733 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"
Apr 22 21:09:55.724741 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.724671 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2kkrv"]
Apr 22 21:09:55.727198 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:55.727163 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ad8720_76bf_492a_b6dc_52abf1046e57.slice/crio-ea060cfa486a32787f4b871143d7a551cb8347eb4bf19eefb4ec6dfea6897882 WatchSource:0}: Error finding container ea060cfa486a32787f4b871143d7a551cb8347eb4bf19eefb4ec6dfea6897882: Status 404 returned error can't find the container with id ea060cfa486a32787f4b871143d7a551cb8347eb4bf19eefb4ec6dfea6897882
Apr 22 21:09:55.913376 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:55.913294 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hm4qp"
Apr 22 21:09:56.044801 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:56.044760 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv" event={"ID":"09ad8720-76bf-492a-b6dc-52abf1046e57","Type":"ContainerStarted","Data":"ea060cfa486a32787f4b871143d7a551cb8347eb4bf19eefb4ec6dfea6897882"}
Apr 22 21:09:58.051436 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:58.051399 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv" event={"ID":"09ad8720-76bf-492a-b6dc-52abf1046e57","Type":"ContainerStarted","Data":"83a2e01aa86e7a2ab82a1b8e20895e5982c2ded061f9865cc83d1f022a28bb0c"}
Apr 22 21:09:58.051436 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:58.051438 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv" event={"ID":"09ad8720-76bf-492a-b6dc-52abf1046e57","Type":"ContainerStarted","Data":"62efd1aa5912a7397aedf304defd50516389da4d46c922ea5591d7381675be7b"}
Apr 22 21:09:58.066111 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:58.066065 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-2kkrv" podStartSLOduration=1.7963450399999998 podStartE2EDuration="3.06604871s" podCreationTimestamp="2026-04-22 21:09:55 +0000 UTC" firstStartedPulling="2026-04-22 21:09:55.729110206 +0000 UTC m=+76.638410899" lastFinishedPulling="2026-04-22 21:09:56.998813879 +0000 UTC m=+77.908114569" observedRunningTime="2026-04-22 21:09:58.065190853 +0000 UTC m=+78.974491563" watchObservedRunningTime="2026-04-22 21:09:58.06604871 +0000 UTC m=+78.975349421"
Apr 22 21:09:59.641167 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.641129 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9xnmz"]
Apr 22 21:09:59.645970 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.645943 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.648044 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.648022 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 21:09:59.648257 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.648036 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7bz2b\""
Apr 22 21:09:59.648325 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.648139 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 21:09:59.648377 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.648154 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 21:09:59.732547 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732503 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-sys\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.732714 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732558 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa0a2fd2-a715-4e30-823a-cabc466c9686-metrics-client-ca\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.732714 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732629 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfrp\" (UniqueName: \"kubernetes.io/projected/fa0a2fd2-a715-4e30-823a-cabc466c9686-kube-api-access-4pfrp\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.732822 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732714 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-tls\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.732822 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732748 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-accelerators-collector-config\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.732822 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732784 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-textfile\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.733005 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732835 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-root\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.733005 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732868 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-wtmp\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.733005 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.732906 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.834036 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834005 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa0a2fd2-a715-4e30-823a-cabc466c9686-metrics-client-ca\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.834036 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834049 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfrp\" (UniqueName: \"kubernetes.io/projected/fa0a2fd2-a715-4e30-823a-cabc466c9686-kube-api-access-4pfrp\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.834272 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834186 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-tls\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.834272 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834232 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-accelerators-collector-config\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
Apr 22 21:09:59.834272 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834268 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-textfile\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz"
(UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-root\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834427 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834355 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-wtmp\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834427 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834394 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834571 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834463 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-sys\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834571 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834550 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-sys\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834706 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834599 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-root\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834706 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834676 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-textfile\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834805 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834727 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-wtmp\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.834938 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834913 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-accelerators-collector-config\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.835050 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.834951 2546 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa0a2fd2-a715-4e30-823a-cabc466c9686-metrics-client-ca\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.837028 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.837004 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.837414 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.837393 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fa0a2fd2-a715-4e30-823a-cabc466c9686-node-exporter-tls\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.844569 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.844545 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfrp\" (UniqueName: \"kubernetes.io/projected/fa0a2fd2-a715-4e30-823a-cabc466c9686-kube-api-access-4pfrp\") pod \"node-exporter-9xnmz\" (UID: \"fa0a2fd2-a715-4e30-823a-cabc466c9686\") " pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.958347 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:09:59.958270 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9xnmz" Apr 22 21:09:59.968400 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:09:59.968369 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa0a2fd2_a715_4e30_823a_cabc466c9686.slice/crio-bada426470ff82ac93736a5846815b2f77f769e3aa1fdb42d864b1abd97be7d3 WatchSource:0}: Error finding container bada426470ff82ac93736a5846815b2f77f769e3aa1fdb42d864b1abd97be7d3: Status 404 returned error can't find the container with id bada426470ff82ac93736a5846815b2f77f769e3aa1fdb42d864b1abd97be7d3 Apr 22 21:10:00.058123 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:00.058085 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9xnmz" event={"ID":"fa0a2fd2-a715-4e30-823a-cabc466c9686","Type":"ContainerStarted","Data":"bada426470ff82ac93736a5846815b2f77f769e3aa1fdb42d864b1abd97be7d3"} Apr 22 21:10:01.032247 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:01.032156 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-264s7" Apr 22 21:10:01.063935 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:01.063900 2546 generic.go:358] "Generic (PLEG): container finished" podID="fa0a2fd2-a715-4e30-823a-cabc466c9686" containerID="324284c45ce0f734f2ea03e1361faa7005323574503e9c3fd8f13b7dc2c04fdd" exitCode=0 Apr 22 21:10:01.064094 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:01.064009 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9xnmz" event={"ID":"fa0a2fd2-a715-4e30-823a-cabc466c9686","Type":"ContainerDied","Data":"324284c45ce0f734f2ea03e1361faa7005323574503e9c3fd8f13b7dc2c04fdd"} Apr 22 21:10:02.071289 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:02.071249 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-9xnmz" event={"ID":"fa0a2fd2-a715-4e30-823a-cabc466c9686","Type":"ContainerStarted","Data":"b43880fbf86ce491957928c89d6e1363f5499440cd5aaa1e9de040855b4d3260"} Apr 22 21:10:02.071289 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:02.071301 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9xnmz" event={"ID":"fa0a2fd2-a715-4e30-823a-cabc466c9686","Type":"ContainerStarted","Data":"63fc45866106bc601a95cbb6aeded3d000b3cf89339ece9d3f8fc28d8f7ce1f0"} Apr 22 21:10:02.089017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:02.088956 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9xnmz" podStartSLOduration=2.305546133 podStartE2EDuration="3.088942426s" podCreationTimestamp="2026-04-22 21:09:59 +0000 UTC" firstStartedPulling="2026-04-22 21:09:59.970305104 +0000 UTC m=+80.879605792" lastFinishedPulling="2026-04-22 21:10:00.753701367 +0000 UTC m=+81.663002085" observedRunningTime="2026-04-22 21:10:02.087383339 +0000 UTC m=+82.996684047" watchObservedRunningTime="2026-04-22 21:10:02.088942426 +0000 UTC m=+82.998243137" Apr 22 21:10:04.553797 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:04.553757 2546 patch_prober.go:28] interesting pod/image-registry-8576cd8d8-tvm8d container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 21:10:04.554226 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:04.553818 2546 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" podUID="aae8c028-d54c-4e48-8065-01b3b3bbc0b0" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 21:10:05.989835 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:05.989804 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:10:08.860388 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.860349 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cf77bdb79-cz699"] Apr 22 21:10:08.864795 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.864777 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:08.867797 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.867759 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 21:10:08.867797 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.867784 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pg8x7\"" Apr 22 21:10:08.867972 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.867767 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 21:10:08.867972 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.867832 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 21:10:08.868077 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.868051 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 21:10:08.868077 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.868059 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 21:10:08.868172 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.868150 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 21:10:08.868360 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.868346 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 21:10:08.873126 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.873103 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 21:10:08.873322 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.873300 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf77bdb79-cz699"] Apr 22 21:10:08.912196 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.912162 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-service-ca\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:08.912334 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.912200 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-serving-cert\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:08.912334 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.912275 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-config\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:08.912404 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.912340 2546 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-trusted-ca-bundle\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:08.912404 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.912370 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-oauth-config\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:08.912404 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.912402 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-oauth-serving-cert\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:08.912495 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:08.912421 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj77p\" (UniqueName: \"kubernetes.io/projected/72b7126b-c54c-4c10-bf2f-422b93d1cae9-kube-api-access-bj77p\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.012845 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.012808 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-serving-cert\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.013017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.012878 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-config\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.013017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.012912 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-trusted-ca-bundle\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.013017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.012950 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-oauth-config\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.013017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.012991 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-oauth-serving-cert\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.013017 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.013016 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj77p\" (UniqueName: \"kubernetes.io/projected/72b7126b-c54c-4c10-bf2f-422b93d1cae9-kube-api-access-bj77p\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.013451 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.013418 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-service-ca\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.014091 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.014064 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-config\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.014426 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.014207 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-oauth-serving-cert\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.014544 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.014217 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-trusted-ca-bundle\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.014683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.014634 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-service-ca\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.016395 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.016364 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-serving-cert\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.016503 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.016466 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-oauth-config\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.020962 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.020942 2546 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj77p\" (UniqueName: \"kubernetes.io/projected/72b7126b-c54c-4c10-bf2f-422b93d1cae9-kube-api-access-bj77p\") pod \"console-6cf77bdb79-cz699\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.175515 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.175426 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:09.299946 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:09.299778 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf77bdb79-cz699"] Apr 22 21:10:09.302543 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:10:09.302513 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72b7126b_c54c_4c10_bf2f_422b93d1cae9.slice/crio-c57779db4ad32e1b8af730b49336fc6231c89a91c85279c1732363fc144dc480 WatchSource:0}: Error finding container c57779db4ad32e1b8af730b49336fc6231c89a91c85279c1732363fc144dc480: Status 404 returned error can't find the container with id c57779db4ad32e1b8af730b49336fc6231c89a91c85279c1732363fc144dc480 Apr 22 21:10:10.098303 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:10.098221 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf77bdb79-cz699" event={"ID":"72b7126b-c54c-4c10-bf2f-422b93d1cae9","Type":"ContainerStarted","Data":"c57779db4ad32e1b8af730b49336fc6231c89a91c85279c1732363fc144dc480"} Apr 22 21:10:13.109371 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:13.109328 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf77bdb79-cz699" event={"ID":"72b7126b-c54c-4c10-bf2f-422b93d1cae9","Type":"ContainerStarted","Data":"216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22"} Apr 22 21:10:13.125289 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:13.125239 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cf77bdb79-cz699" podStartSLOduration=2.3392015170000002 podStartE2EDuration="5.125224476s" podCreationTimestamp="2026-04-22 21:10:08 +0000 UTC" firstStartedPulling="2026-04-22 21:10:09.304675394 +0000 UTC m=+90.213976085" lastFinishedPulling="2026-04-22 21:10:12.090698354 +0000 UTC m=+92.999999044" observedRunningTime="2026-04-22 21:10:13.124090894 +0000 UTC m=+94.033391604" watchObservedRunningTime="2026-04-22 21:10:13.125224476 +0000 UTC m=+94.034525186" Apr 22 21:10:13.819628 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:13.819594 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8576cd8d8-tvm8d"] Apr 22 21:10:16.817593 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:16.817555 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cf77bdb79-cz699"] Apr 22 21:10:19.175836 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:19.175802 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:34.169745 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:34.169710 2546 generic.go:358] "Generic (PLEG): container finished" podID="dd3b0638-5a78-4ca8-8274-4653bf8b98d0" containerID="2f6660720f045e31a7cfd859faa5ad4973bc96d50149a07acc13f9cdb8c1049c" exitCode=0 Apr 22 21:10:34.170147 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:34.169777 2546 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44nf9" event={"ID":"dd3b0638-5a78-4ca8-8274-4653bf8b98d0","Type":"ContainerDied","Data":"2f6660720f045e31a7cfd859faa5ad4973bc96d50149a07acc13f9cdb8c1049c"} Apr 22 21:10:34.170189 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:34.170151 2546 scope.go:117] "RemoveContainer" containerID="2f6660720f045e31a7cfd859faa5ad4973bc96d50149a07acc13f9cdb8c1049c" Apr 22 21:10:35.174509 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:35.174460 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44nf9" event={"ID":"dd3b0638-5a78-4ca8-8274-4653bf8b98d0","Type":"ContainerStarted","Data":"b4ac47d303e5aa1aedb2b22123a4d661a37f0cc6b6c54b4a7aee62e14dc69ecf"} Apr 22 21:10:38.838871 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:38.838827 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" podUID="aae8c028-d54c-4e48-8065-01b3b3bbc0b0" containerName="registry" containerID="cri-o://8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e" gracePeriod=30 Apr 22 21:10:39.082560 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.082537 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:10:39.189811 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.189718 2546 generic.go:358] "Generic (PLEG): container finished" podID="aae8c028-d54c-4e48-8065-01b3b3bbc0b0" containerID="8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e" exitCode=0 Apr 22 21:10:39.189811 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.189761 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" event={"ID":"aae8c028-d54c-4e48-8065-01b3b3bbc0b0","Type":"ContainerDied","Data":"8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e"} Apr 22 21:10:39.189811 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.189783 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" event={"ID":"aae8c028-d54c-4e48-8065-01b3b3bbc0b0","Type":"ContainerDied","Data":"1f6110270096e468bb9538bd15875595f01152841aca98124eea69d56d9b694c"} Apr 22 21:10:39.189811 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.189789 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8576cd8d8-tvm8d" Apr 22 21:10:39.189811 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.189798 2546 scope.go:117] "RemoveContainer" containerID="8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e" Apr 22 21:10:39.197595 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.197573 2546 scope.go:117] "RemoveContainer" containerID="8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e" Apr 22 21:10:39.197851 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:10:39.197832 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e\": container with ID starting with 8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e not found: ID does not exist" containerID="8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e" Apr 22 21:10:39.197900 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.197866 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e"} err="failed to get container status \"8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e\": rpc error: code = NotFound desc = could not find container \"8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e\": container with ID starting with 8f1ee2cf48d29d5170528137615f9e8cdc52f5e8080e8aa19ce0162a547b344e not found: ID does not exist" Apr 22 21:10:39.277557 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.277528 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-bound-sa-token\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.277736 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.277579 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-installation-pull-secrets\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.277736 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.277605 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-certificates\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.277736 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.277675 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.277736 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.277707 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjsjc\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-kube-api-access-vjsjc\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.277736 ip-10-0-138-20 kubenswrapper[2546]: I0422 
21:10:39.277728 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-ca-trust-extracted\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.278398 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.277929 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-trusted-ca\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.278398 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.278029 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-image-registry-private-configuration\") pod \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\" (UID: \"aae8c028-d54c-4e48-8065-01b3b3bbc0b0\") " Apr 22 21:10:39.278398 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.278081 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:39.278398 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.278282 2546 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-certificates\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.278765 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.278406 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:39.280465 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.280410 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:39.280569 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.280506 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:39.280739 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.280711 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-kube-api-access-vjsjc" (OuterVolumeSpecName: "kube-api-access-vjsjc") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "kube-api-access-vjsjc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:39.280815 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.280752 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:39.280867 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.280823 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:39.286955 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.286933 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aae8c028-d54c-4e48-8065-01b3b3bbc0b0" (UID: "aae8c028-d54c-4e48-8065-01b3b3bbc0b0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:10:39.378817 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.378788 2546 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-bound-sa-token\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.378817 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.378814 2546 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-installation-pull-secrets\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.378817 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.378824 2546 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-registry-tls\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.379019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.378833 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vjsjc\" (UniqueName: \"kubernetes.io/projected/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-kube-api-access-vjsjc\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.379019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.378842 2546 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-ca-trust-extracted\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.379019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.378852 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-trusted-ca\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.379019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.378861 2546 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aae8c028-d54c-4e48-8065-01b3b3bbc0b0-image-registry-private-configuration\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:39.509960 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.509927 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8576cd8d8-tvm8d"] Apr 22 21:10:39.512926 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.512900 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8576cd8d8-tvm8d"] Apr 22 21:10:39.680265 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:39.680225 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae8c028-d54c-4e48-8065-01b3b3bbc0b0" path="/var/lib/kubelet/pods/aae8c028-d54c-4e48-8065-01b3b3bbc0b0/volumes" Apr 22 21:10:41.838538 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:41.838475 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cf77bdb79-cz699" podUID="72b7126b-c54c-4c10-bf2f-422b93d1cae9" containerName="console" containerID="cri-o://216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22" gracePeriod=15 Apr 22 21:10:42.075553 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.075531 2546 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6cf77bdb79-cz699_72b7126b-c54c-4c10-bf2f-422b93d1cae9/console/0.log" Apr 22 21:10:42.075682 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.075596 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:42.099828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.099741 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj77p\" (UniqueName: \"kubernetes.io/projected/72b7126b-c54c-4c10-bf2f-422b93d1cae9-kube-api-access-bj77p\") pod \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " Apr 22 21:10:42.099828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.099799 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-oauth-config\") pod \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " Apr 22 21:10:42.100030 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.099836 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-oauth-serving-cert\") pod \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " Apr 22 21:10:42.100030 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.099870 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-trusted-ca-bundle\") pod \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " Apr 22 21:10:42.100030 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.099899 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-service-ca\") pod \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " Apr 22 21:10:42.100030 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.099935 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-config\") pod \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " Apr 22 21:10:42.100030 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.099964 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-serving-cert\") pod \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\" (UID: \"72b7126b-c54c-4c10-bf2f-422b93d1cae9\") " Apr 22 21:10:42.100588 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.100373 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "72b7126b-c54c-4c10-bf2f-422b93d1cae9" (UID: "72b7126b-c54c-4c10-bf2f-422b93d1cae9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:42.100588 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.100531 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-service-ca" (OuterVolumeSpecName: "service-ca") pod "72b7126b-c54c-4c10-bf2f-422b93d1cae9" (UID: "72b7126b-c54c-4c10-bf2f-422b93d1cae9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:42.100588 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.100542 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "72b7126b-c54c-4c10-bf2f-422b93d1cae9" (UID: "72b7126b-c54c-4c10-bf2f-422b93d1cae9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:42.100588 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.100564 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-config" (OuterVolumeSpecName: "console-config") pod "72b7126b-c54c-4c10-bf2f-422b93d1cae9" (UID: "72b7126b-c54c-4c10-bf2f-422b93d1cae9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:42.102508 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.102477 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b7126b-c54c-4c10-bf2f-422b93d1cae9-kube-api-access-bj77p" (OuterVolumeSpecName: "kube-api-access-bj77p") pod "72b7126b-c54c-4c10-bf2f-422b93d1cae9" (UID: "72b7126b-c54c-4c10-bf2f-422b93d1cae9"). InnerVolumeSpecName "kube-api-access-bj77p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:42.102877 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.102852 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "72b7126b-c54c-4c10-bf2f-422b93d1cae9" (UID: "72b7126b-c54c-4c10-bf2f-422b93d1cae9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:42.102877 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.102863 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "72b7126b-c54c-4c10-bf2f-422b93d1cae9" (UID: "72b7126b-c54c-4c10-bf2f-422b93d1cae9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:42.199370 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.199346 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cf77bdb79-cz699_72b7126b-c54c-4c10-bf2f-422b93d1cae9/console/0.log" Apr 22 21:10:42.199517 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.199388 2546 generic.go:358] "Generic (PLEG): container finished" podID="72b7126b-c54c-4c10-bf2f-422b93d1cae9" containerID="216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22" exitCode=2 Apr 22 21:10:42.199517 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.199451 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf77bdb79-cz699" event={"ID":"72b7126b-c54c-4c10-bf2f-422b93d1cae9","Type":"ContainerDied","Data":"216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22"} Apr 22 21:10:42.199517 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.199491 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf77bdb79-cz699" event={"ID":"72b7126b-c54c-4c10-bf2f-422b93d1cae9","Type":"ContainerDied","Data":"c57779db4ad32e1b8af730b49336fc6231c89a91c85279c1732363fc144dc480"} Apr 22 21:10:42.199517 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.199507 2546 scope.go:117] "RemoveContainer" containerID="216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22" Apr 22 21:10:42.199697 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.199455 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf77bdb79-cz699" Apr 22 21:10:42.200489 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.200460 2546 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-oauth-serving-cert\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:42.200489 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.200487 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-trusted-ca-bundle\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:42.200624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.200501 2546 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-service-ca\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:42.200624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.200517 2546 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-config\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:42.200624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.200528 2546 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-serving-cert\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:10:42.200624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.200543 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bj77p\" (UniqueName: \"kubernetes.io/projected/72b7126b-c54c-4c10-bf2f-422b93d1cae9-kube-api-access-bj77p\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 
21:10:42.200624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.200555 2546 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72b7126b-c54c-4c10-bf2f-422b93d1cae9-console-oauth-config\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 22 21:10:42.208095 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.208080 2546 scope.go:117] "RemoveContainer" containerID="216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22"
Apr 22 21:10:42.208351 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:10:42.208329 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22\": container with ID starting with 216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22 not found: ID does not exist" containerID="216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22"
Apr 22 21:10:42.208413 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.208363 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22"} err="failed to get container status \"216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22\": rpc error: code = NotFound desc = could not find container \"216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22\": container with ID starting with 216ef60cca3c507be11c7cafebd341a71fbd5957be1a8c2a277d982604b9cc22 not found: ID does not exist"
Apr 22 21:10:42.220490 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.220461 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cf77bdb79-cz699"]
Apr 22 21:10:42.223947 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:42.223919 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cf77bdb79-cz699"]
Apr 22 21:10:43.679600 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:43.679557 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b7126b-c54c-4c10-bf2f-422b93d1cae9" path="/var/lib/kubelet/pods/72b7126b-c54c-4c10-bf2f-422b93d1cae9/volumes"
Apr 22 21:10:45.112789 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.112754 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7997c4447c-njt5q"]
Apr 22 21:10:45.113284 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.113204 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72b7126b-c54c-4c10-bf2f-422b93d1cae9" containerName="console"
Apr 22 21:10:45.113284 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.113225 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b7126b-c54c-4c10-bf2f-422b93d1cae9" containerName="console"
Apr 22 21:10:45.113284 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.113253 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aae8c028-d54c-4e48-8065-01b3b3bbc0b0" containerName="registry"
Apr 22 21:10:45.113284 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.113261 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae8c028-d54c-4e48-8065-01b3b3bbc0b0" containerName="registry"
Apr 22 21:10:45.113483 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.113336 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="72b7126b-c54c-4c10-bf2f-422b93d1cae9" containerName="console"
Apr 22 21:10:45.113483 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.113352 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="aae8c028-d54c-4e48-8065-01b3b3bbc0b0" containerName="registry"
Apr 22 21:10:45.116390 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.116362 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.119727 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.119826 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.119727 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.119963 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120082 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120100 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pg8x7\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120100 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120128 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-serving-cert\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120161 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-service-ca\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120240 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-config\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120266 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 21:10:45.120246 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120278 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-oauth-serving-cert\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.120964 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120305 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-oauth-config\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.120964 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120327 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-trusted-ca-bundle\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.120964 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.120352 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzhqf\" (UniqueName: \"kubernetes.io/projected/d5994f97-83e3-4ca4-a33b-0c478db407cf-kube-api-access-wzhqf\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.125600 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.125567 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 21:10:45.131671 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.129053 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7997c4447c-njt5q"]
Apr 22 21:10:45.212052 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.212015 2546 generic.go:358] "Generic (PLEG): container finished" podID="2d34322a-f3f9-4e65-8714-cec36115f27e" containerID="9bb1f0ea925dfe635fd77bcc648268481debfa0fe4d2dca7723798780c561a99" exitCode=0
Apr 22 21:10:45.212203 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.212039 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" event={"ID":"2d34322a-f3f9-4e65-8714-cec36115f27e","Type":"ContainerDied","Data":"9bb1f0ea925dfe635fd77bcc648268481debfa0fe4d2dca7723798780c561a99"}
Apr 22 21:10:45.212430 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.212414 2546 scope.go:117] "RemoveContainer" containerID="9bb1f0ea925dfe635fd77bcc648268481debfa0fe4d2dca7723798780c561a99"
Apr 22 21:10:45.221295 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.221274 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-config\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.221469 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.221449 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-oauth-serving-cert\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.221598 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.221582 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-oauth-config\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.221758 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.221740 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-trusted-ca-bundle\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.221890 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.221874 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzhqf\" (UniqueName: \"kubernetes.io/projected/d5994f97-83e3-4ca4-a33b-0c478db407cf-kube-api-access-wzhqf\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.222084 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.222064 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-serving-cert\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.222222 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.222205 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-service-ca\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.222803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.222696 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-trusted-ca-bundle\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.222803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.222317 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-config\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.222959 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.222849 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-service-ca\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.222959 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.222865 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-oauth-serving-cert\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.225144 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.225121 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-serving-cert\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.225354 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.225332 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-oauth-config\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.236388 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.236358 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzhqf\" (UniqueName: \"kubernetes.io/projected/d5994f97-83e3-4ca4-a33b-0c478db407cf-kube-api-access-wzhqf\") pod \"console-7997c4447c-njt5q\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.435525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.435429 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:45.579134 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:45.579102 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7997c4447c-njt5q"]
Apr 22 21:10:45.581720 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:10:45.581677 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5994f97_83e3_4ca4_a33b_0c478db407cf.slice/crio-8111a72237871f5f805cae20063e55929dc6b5aa0de592bf173312c099d7a027 WatchSource:0}: Error finding container 8111a72237871f5f805cae20063e55929dc6b5aa0de592bf173312c099d7a027: Status 404 returned error can't find the container with id 8111a72237871f5f805cae20063e55929dc6b5aa0de592bf173312c099d7a027
Apr 22 21:10:46.216945 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:46.216909 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-567gg" event={"ID":"2d34322a-f3f9-4e65-8714-cec36115f27e","Type":"ContainerStarted","Data":"2367c1f5360c054f37b9bade1cfe57ce7e0a3246ae0ec08386a1e273ba8b6a96"}
Apr 22 21:10:46.218330 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:46.218307 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7997c4447c-njt5q" event={"ID":"d5994f97-83e3-4ca4-a33b-0c478db407cf","Type":"ContainerStarted","Data":"236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b"}
Apr 22 21:10:46.218452 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:46.218335 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7997c4447c-njt5q" event={"ID":"d5994f97-83e3-4ca4-a33b-0c478db407cf","Type":"ContainerStarted","Data":"8111a72237871f5f805cae20063e55929dc6b5aa0de592bf173312c099d7a027"}
Apr 22 21:10:46.246926 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:46.246861 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7997c4447c-njt5q" podStartSLOduration=1.246824626 podStartE2EDuration="1.246824626s" podCreationTimestamp="2026-04-22 21:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:10:46.245316028 +0000 UTC m=+127.154616740" watchObservedRunningTime="2026-04-22 21:10:46.246824626 +0000 UTC m=+127.156125339"
Apr 22 21:10:50.231762 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:50.231725 2546 generic.go:358] "Generic (PLEG): container finished" podID="4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347" containerID="8f4a31cf4d0f875a7b12dc17f937a8c18c6b4f0fbf2acfee4e65ae1ce0dd2d0a" exitCode=0
Apr 22 21:10:50.232131 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:50.231798 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" event={"ID":"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347","Type":"ContainerDied","Data":"8f4a31cf4d0f875a7b12dc17f937a8c18c6b4f0fbf2acfee4e65ae1ce0dd2d0a"}
Apr 22 21:10:50.232200 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:50.232181 2546 scope.go:117] "RemoveContainer" containerID="8f4a31cf4d0f875a7b12dc17f937a8c18c6b4f0fbf2acfee4e65ae1ce0dd2d0a"
Apr 22 21:10:51.235880 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:51.235843 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-b4sm6" event={"ID":"4d6941f8-9b27-4ee7-b48d-7b1ba8e5a347","Type":"ContainerStarted","Data":"303446a61d1e08ee5b925c5b3694c1b89278a7ff23f8499a8f7a4bb89812aa2e"}
Apr 22 21:10:55.435692 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:55.435628 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:55.435692 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:55.435699 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:55.440582 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:55.440558 2546 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:10:56.254759 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:10:56.254722 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7997c4447c-njt5q"
Apr 22 21:12:01.839891 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.839855 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-vtssh"]
Apr 22 21:12:01.843069 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.843052 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:01.845423 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.845404 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 21:12:01.845513 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.845442 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-bs8t8\""
Apr 22 21:12:01.845885 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.845870 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 21:12:01.850733 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.850714 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-vtssh"]
Apr 22 21:12:01.937798 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.937762 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpvnq\" (UniqueName: \"kubernetes.io/projected/c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a-kube-api-access-qpvnq\") pod \"cert-manager-cainjector-68b757865b-vtssh\" (UID: \"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:01.937967 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:01.937815 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-vtssh\" (UID: \"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:02.039203 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:02.039160 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-vtssh\" (UID: \"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:02.039396 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:02.039289 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpvnq\" (UniqueName: \"kubernetes.io/projected/c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a-kube-api-access-qpvnq\") pod \"cert-manager-cainjector-68b757865b-vtssh\" (UID: \"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:02.046485 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:02.046451 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-vtssh\" (UID: \"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:02.046591 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:02.046547 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpvnq\" (UniqueName: \"kubernetes.io/projected/c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a-kube-api-access-qpvnq\") pod \"cert-manager-cainjector-68b757865b-vtssh\" (UID: \"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a\") " pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:02.164167 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:02.164092 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh"
Apr 22 21:12:02.283599 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:02.283569 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-vtssh"]
Apr 22 21:12:02.286604 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:12:02.286574 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3ae2aa2_23e1_4b6f_b892_b0dd07e8fe5a.slice/crio-58bc88c3aaf64bddb491cb51200669b49e139af93e9a8c35e6e159fb0cd29881 WatchSource:0}: Error finding container 58bc88c3aaf64bddb491cb51200669b49e139af93e9a8c35e6e159fb0cd29881: Status 404 returned error can't find the container with id 58bc88c3aaf64bddb491cb51200669b49e139af93e9a8c35e6e159fb0cd29881
Apr 22 21:12:02.441113 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:02.441030 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh" event={"ID":"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a","Type":"ContainerStarted","Data":"58bc88c3aaf64bddb491cb51200669b49e139af93e9a8c35e6e159fb0cd29881"}
Apr 22 21:12:06.454613 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:06.454578 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh" event={"ID":"c3ae2aa2-23e1-4b6f-b892-b0dd07e8fe5a","Type":"ContainerStarted","Data":"d61164911855eebfe67e9a84ecd6dfdc8e671fb656ea298515165a00df293e4d"}
Apr 22 21:12:06.470312 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:06.470254 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-vtssh" podStartSLOduration=2.295147999 podStartE2EDuration="5.470234537s" podCreationTimestamp="2026-04-22 21:12:01 +0000 UTC" firstStartedPulling="2026-04-22 21:12:02.288462924 +0000 UTC m=+203.197763612" lastFinishedPulling="2026-04-22 21:12:05.463549461 +0000 UTC m=+206.372850150" observedRunningTime="2026-04-22 21:12:06.469753591 +0000 UTC m=+207.379054302" watchObservedRunningTime="2026-04-22 21:12:06.470234537 +0000 UTC m=+207.379535250"
Apr 22 21:12:15.784922 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.784886 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-75zwz"]
Apr 22 21:12:15.788153 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.788136 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:15.790237 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.790220 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-pllh6\""
Apr 22 21:12:15.796525 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.796501 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-75zwz"]
Apr 22 21:12:15.852604 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.852566 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zc49\" (UniqueName: \"kubernetes.io/projected/5ced75de-47d5-4b12-9f93-8883f6ed4136-kube-api-access-5zc49\") pod \"cert-manager-79c8d999ff-75zwz\" (UID: \"5ced75de-47d5-4b12-9f93-8883f6ed4136\") " pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:15.852604 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.852604 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ced75de-47d5-4b12-9f93-8883f6ed4136-bound-sa-token\") pod \"cert-manager-79c8d999ff-75zwz\" (UID: \"5ced75de-47d5-4b12-9f93-8883f6ed4136\") " pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:15.953168 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.953129 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zc49\" (UniqueName: \"kubernetes.io/projected/5ced75de-47d5-4b12-9f93-8883f6ed4136-kube-api-access-5zc49\") pod \"cert-manager-79c8d999ff-75zwz\" (UID: \"5ced75de-47d5-4b12-9f93-8883f6ed4136\") " pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:15.953168 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.953176 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ced75de-47d5-4b12-9f93-8883f6ed4136-bound-sa-token\") pod \"cert-manager-79c8d999ff-75zwz\" (UID: \"5ced75de-47d5-4b12-9f93-8883f6ed4136\") " pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:15.962310 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.962277 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ced75de-47d5-4b12-9f93-8883f6ed4136-bound-sa-token\") pod \"cert-manager-79c8d999ff-75zwz\" (UID: \"5ced75de-47d5-4b12-9f93-8883f6ed4136\") " pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:15.962447 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:15.962429 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zc49\" (UniqueName: \"kubernetes.io/projected/5ced75de-47d5-4b12-9f93-8883f6ed4136-kube-api-access-5zc49\") pod \"cert-manager-79c8d999ff-75zwz\" (UID: \"5ced75de-47d5-4b12-9f93-8883f6ed4136\") " pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:16.097514 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:16.097468 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-75zwz"
Apr 22 21:12:16.234752 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:16.234718 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-75zwz"]
Apr 22 21:12:16.237902 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:12:16.237868 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ced75de_47d5_4b12_9f93_8883f6ed4136.slice/crio-fb875c2b3f14fc43d342bfc2655ee7bfa830d3e4d4f5408362f4ae58a4c2fef1 WatchSource:0}: Error finding container fb875c2b3f14fc43d342bfc2655ee7bfa830d3e4d4f5408362f4ae58a4c2fef1: Status 404 returned error can't find the container with id fb875c2b3f14fc43d342bfc2655ee7bfa830d3e4d4f5408362f4ae58a4c2fef1
Apr 22 21:12:16.487133 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:16.487036 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-75zwz" event={"ID":"5ced75de-47d5-4b12-9f93-8883f6ed4136","Type":"ContainerStarted","Data":"a2e89211e872c70f901ba8d2f71e7b3717c5ee5d3be0f8ce0f23a8f07f51852d"}
Apr 22 21:12:16.487133 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:16.487075 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-75zwz" event={"ID":"5ced75de-47d5-4b12-9f93-8883f6ed4136","Type":"ContainerStarted","Data":"fb875c2b3f14fc43d342bfc2655ee7bfa830d3e4d4f5408362f4ae58a4c2fef1"}
Apr 22 21:12:16.502196 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:16.502148 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-75zwz" podStartSLOduration=1.50213516 podStartE2EDuration="1.50213516s" podCreationTimestamp="2026-04-22 21:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:12:16.500347163 +0000 UTC m=+217.409647873" watchObservedRunningTime="2026-04-22 21:12:16.50213516 +0000 UTC m=+217.411435871"
Apr 22 21:12:31.940669 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.940609 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"]
Apr 22 21:12:31.945045 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.945018 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:31.947427 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.947404 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 22 21:12:31.947575 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.947428 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 22 21:12:31.947575 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.947494 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jq55h\""
Apr 22 21:12:31.947775 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.947758 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 22 21:12:31.948510 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.948493 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 22 21:12:31.962509 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:31.962488 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"]
Apr 22 21:12:32.097003 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.096963 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5872c721-2d2b-4d40-908f-b212dbaa3dcf-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.097173 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.097020 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5872c721-2d2b-4d40-908f-b212dbaa3dcf-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.097173 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.097070 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x27p4\" (UniqueName: \"kubernetes.io/projected/5872c721-2d2b-4d40-908f-b212dbaa3dcf-kube-api-access-x27p4\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.198412 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.198307 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x27p4\" (UniqueName: \"kubernetes.io/projected/5872c721-2d2b-4d40-908f-b212dbaa3dcf-kube-api-access-x27p4\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.198412 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.198387 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5872c721-2d2b-4d40-908f-b212dbaa3dcf-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.198602 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.198421 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5872c721-2d2b-4d40-908f-b212dbaa3dcf-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.201267 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.201237 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5872c721-2d2b-4d40-908f-b212dbaa3dcf-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.201384 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.201297 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5872c721-2d2b-4d40-908f-b212dbaa3dcf-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.205760 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.205735 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x27p4\" (UniqueName: \"kubernetes.io/projected/5872c721-2d2b-4d40-908f-b212dbaa3dcf-kube-api-access-x27p4\") pod \"opendatahub-operator-controller-manager-754bfc4657-nfnf9\" (UID: \"5872c721-2d2b-4d40-908f-b212dbaa3dcf\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.256631 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.256594 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:32.397242 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.397215 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"]
Apr 22 21:12:32.399839 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:12:32.399812 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5872c721_2d2b_4d40_908f_b212dbaa3dcf.slice/crio-5d099b732ed3a04329b9432f1863541ff1be5a85014a72e0b26506056adf29e1 WatchSource:0}: Error finding container 5d099b732ed3a04329b9432f1863541ff1be5a85014a72e0b26506056adf29e1: Status 404 returned error can't find the container with id 5d099b732ed3a04329b9432f1863541ff1be5a85014a72e0b26506056adf29e1
Apr 22 21:12:32.534757 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:32.534667 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9" event={"ID":"5872c721-2d2b-4d40-908f-b212dbaa3dcf","Type":"ContainerStarted","Data":"5d099b732ed3a04329b9432f1863541ff1be5a85014a72e0b26506056adf29e1"}
Apr 22 21:12:35.550254 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:35.550163 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9" event={"ID":"5872c721-2d2b-4d40-908f-b212dbaa3dcf","Type":"ContainerStarted","Data":"72fce2fa9cc3f6820172924d791fb3a7fee2146b7056813f77bbc5792cc6a70d"}
Apr 22 21:12:35.550634 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:35.550281 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:35.568829 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:35.568779 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9" podStartSLOduration=1.8586805 podStartE2EDuration="4.568765232s" podCreationTimestamp="2026-04-22 21:12:31 +0000 UTC" firstStartedPulling="2026-04-22 21:12:32.40146932 +0000 UTC m=+233.310770009" lastFinishedPulling="2026-04-22 21:12:35.111554043 +0000 UTC m=+236.020854741" observedRunningTime="2026-04-22 21:12:35.568631733 +0000 UTC m=+236.477932444" watchObservedRunningTime="2026-04-22 21:12:35.568765232 +0000 UTC m=+236.478065943"
Apr 22 21:12:46.555061 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:46.555029 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nfnf9"
Apr 22 21:12:53.883813 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:53.883735 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hg6v8"]
Apr 22 21:12:53.887032 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:53.887017 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:53.889124 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:53.889105 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 22 21:12:53.889230 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:53.889177 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-vzvv4\""
Apr 22 21:12:53.895803 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:53.895780 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hg6v8"]
Apr 22 21:12:53.978668 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:53.978606 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9zhq\" (UniqueName: \"kubernetes.io/projected/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-kube-api-access-f9zhq\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:53.978822 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:53.978677 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:54.079832 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:54.079797 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9zhq\" (UniqueName: \"kubernetes.io/projected/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-kube-api-access-f9zhq\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:54.079832 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:54.079835 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:54.080026 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:12:54.079934 2546 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 21:12:54.080026 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:12:54.079986 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert podName:1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31 nodeName:}" failed. No retries permitted until 2026-04-22 21:12:54.579970295 +0000 UTC m=+255.489270985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert") pod "odh-model-controller-858dbf95b8-hg6v8" (UID: "1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31") : secret "odh-model-controller-webhook-cert" not found
Apr 22 21:12:54.089505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:54.089478 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9zhq\" (UniqueName: \"kubernetes.io/projected/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-kube-api-access-f9zhq\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:54.584340 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:54.584301 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:54.584539 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:12:54.584455 2546 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 21:12:54.584539 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:12:54.584521 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert podName:1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31 nodeName:}" failed. No retries permitted until 2026-04-22 21:12:55.584505962 +0000 UTC m=+256.493806654 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert") pod "odh-model-controller-858dbf95b8-hg6v8" (UID: "1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31") : secret "odh-model-controller-webhook-cert" not found
Apr 22 21:12:55.593535 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:55.593498 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:55.596052 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:55.596033 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31-cert\") pod \"odh-model-controller-858dbf95b8-hg6v8\" (UID: \"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31\") " pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:55.697830 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:55.697798 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:55.820826 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:55.820799 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hg6v8"]
Apr 22 21:12:55.823578 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:12:55.823554 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ccc3e41_3f5b_442d_b8fd_e1f72bb33f31.slice/crio-73b282044648964a300643fcd2f32093d7dc9d6fa865a5243a480f3be29c255b WatchSource:0}: Error finding container 73b282044648964a300643fcd2f32093d7dc9d6fa865a5243a480f3be29c255b: Status 404 returned error can't find the container with id 73b282044648964a300643fcd2f32093d7dc9d6fa865a5243a480f3be29c255b
Apr 22 21:12:56.621505 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:56.621468 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8" event={"ID":"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31","Type":"ContainerStarted","Data":"73b282044648964a300643fcd2f32093d7dc9d6fa865a5243a480f3be29c255b"}
Apr 22 21:12:58.612196 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.612146 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-pgwps"]
Apr 22 21:12:58.615452 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.615437 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:58.617997 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.617969 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 22 21:12:58.618286 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.618268 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-dhl44\""
Apr 22 21:12:58.622897 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.622869 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-pgwps"]
Apr 22 21:12:58.721077 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.721041 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grg5\" (UniqueName: \"kubernetes.io/projected/9c84221e-24f9-4718-bbaf-6b5e5826c68e-kube-api-access-7grg5\") pod \"kserve-controller-manager-856948b99f-pgwps\" (UID: \"9c84221e-24f9-4718-bbaf-6b5e5826c68e\") " pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:58.721265 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.721097 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c84221e-24f9-4718-bbaf-6b5e5826c68e-cert\") pod \"kserve-controller-manager-856948b99f-pgwps\" (UID: \"9c84221e-24f9-4718-bbaf-6b5e5826c68e\") " pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:58.822669 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.822594 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7grg5\" (UniqueName: \"kubernetes.io/projected/9c84221e-24f9-4718-bbaf-6b5e5826c68e-kube-api-access-7grg5\") pod \"kserve-controller-manager-856948b99f-pgwps\" (UID: \"9c84221e-24f9-4718-bbaf-6b5e5826c68e\") " pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:58.822856 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.822683 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c84221e-24f9-4718-bbaf-6b5e5826c68e-cert\") pod \"kserve-controller-manager-856948b99f-pgwps\" (UID: \"9c84221e-24f9-4718-bbaf-6b5e5826c68e\") " pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:58.822856 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:12:58.822809 2546 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 22 21:12:58.822941 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:12:58.822871 2546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c84221e-24f9-4718-bbaf-6b5e5826c68e-cert podName:9c84221e-24f9-4718-bbaf-6b5e5826c68e nodeName:}" failed. No retries permitted until 2026-04-22 21:12:59.322854976 +0000 UTC m=+260.232155665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9c84221e-24f9-4718-bbaf-6b5e5826c68e-cert") pod "kserve-controller-manager-856948b99f-pgwps" (UID: "9c84221e-24f9-4718-bbaf-6b5e5826c68e") : secret "kserve-webhook-server-cert" not found
Apr 22 21:12:58.832392 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:58.832363 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7grg5\" (UniqueName: \"kubernetes.io/projected/9c84221e-24f9-4718-bbaf-6b5e5826c68e-kube-api-access-7grg5\") pod \"kserve-controller-manager-856948b99f-pgwps\" (UID: \"9c84221e-24f9-4718-bbaf-6b5e5826c68e\") " pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:59.327044 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:59.327013 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c84221e-24f9-4718-bbaf-6b5e5826c68e-cert\") pod \"kserve-controller-manager-856948b99f-pgwps\" (UID: \"9c84221e-24f9-4718-bbaf-6b5e5826c68e\") " pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:59.329609 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:59.329570 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c84221e-24f9-4718-bbaf-6b5e5826c68e-cert\") pod \"kserve-controller-manager-856948b99f-pgwps\" (UID: \"9c84221e-24f9-4718-bbaf-6b5e5826c68e\") " pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:59.530114 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:59.530019 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:12:59.634772 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:59.634741 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8" event={"ID":"1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31","Type":"ContainerStarted","Data":"6aea64760a5e78e34dfd97f99418315f3c1f4107635b3105123c5e20b6640529"}
Apr 22 21:12:59.635357 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:59.635317 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:12:59.651947 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:59.651890 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8" podStartSLOduration=3.302146465 podStartE2EDuration="6.651873281s" podCreationTimestamp="2026-04-22 21:12:53 +0000 UTC" firstStartedPulling="2026-04-22 21:12:55.825221438 +0000 UTC m=+256.734522127" lastFinishedPulling="2026-04-22 21:12:59.174948251 +0000 UTC m=+260.084248943" observedRunningTime="2026-04-22 21:12:59.650672537 +0000 UTC m=+260.559973249" watchObservedRunningTime="2026-04-22 21:12:59.651873281 +0000 UTC m=+260.561173991"
Apr 22 21:12:59.664983 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:12:59.664864 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-pgwps"]
Apr 22 21:12:59.667494 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:12:59.667456 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c84221e_24f9_4718_bbaf_6b5e5826c68e.slice/crio-eb1c685c09c1236dd51eeacc04d11b8aefc3aed01bfe9a0db10b61f28e8766b3 WatchSource:0}: Error finding container eb1c685c09c1236dd51eeacc04d11b8aefc3aed01bfe9a0db10b61f28e8766b3: Status 404 returned error can't find the container with id eb1c685c09c1236dd51eeacc04d11b8aefc3aed01bfe9a0db10b61f28e8766b3
Apr 22 21:13:00.639732 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.639691 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-pgwps" event={"ID":"9c84221e-24f9-4718-bbaf-6b5e5826c68e","Type":"ContainerStarted","Data":"eb1c685c09c1236dd51eeacc04d11b8aefc3aed01bfe9a0db10b61f28e8766b3"}
Apr 22 21:13:00.861172 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.861137 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"]
Apr 22 21:13:00.865238 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.865208 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:00.867481 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.867447 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 22 21:13:00.867610 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.867505 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-rsqq4\""
Apr 22 21:13:00.867741 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.867722 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 22 21:13:00.874300 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.874279 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"]
Apr 22 21:13:00.940019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.939928 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc4px\" (UniqueName: \"kubernetes.io/projected/fd430c6d-ebaf-440a-8f07-b941805b7dae-kube-api-access-rc4px\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:00.940019 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.939996 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd430c6d-ebaf-440a-8f07-b941805b7dae-tmp\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:00.940237 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:00.940052 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd430c6d-ebaf-440a-8f07-b941805b7dae-tls-certs\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.040761 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.040728 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rc4px\" (UniqueName: \"kubernetes.io/projected/fd430c6d-ebaf-440a-8f07-b941805b7dae-kube-api-access-rc4px\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.040956 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.040784 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd430c6d-ebaf-440a-8f07-b941805b7dae-tmp\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.040956 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.040834 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd430c6d-ebaf-440a-8f07-b941805b7dae-tls-certs\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.043682 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.043623 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd430c6d-ebaf-440a-8f07-b941805b7dae-tmp\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.044209 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.044187 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fd430c6d-ebaf-440a-8f07-b941805b7dae-tls-certs\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.048414 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.048384 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc4px\" (UniqueName: \"kubernetes.io/projected/fd430c6d-ebaf-440a-8f07-b941805b7dae-kube-api-access-rc4px\") pod \"kube-auth-proxy-567cb9698d-24qdg\" (UID: \"fd430c6d-ebaf-440a-8f07-b941805b7dae\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.176850 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.176814 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"
Apr 22 21:13:01.322915 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.322848 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-567cb9698d-24qdg"]
Apr 22 21:13:01.324103 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:13:01.324064 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd430c6d_ebaf_440a_8f07_b941805b7dae.slice/crio-0f16807a0ffca87354961962138969e11c271bf334c433e39c9b9a5008e36303 WatchSource:0}: Error finding container 0f16807a0ffca87354961962138969e11c271bf334c433e39c9b9a5008e36303: Status 404 returned error can't find the container with id 0f16807a0ffca87354961962138969e11c271bf334c433e39c9b9a5008e36303
Apr 22 21:13:01.645163 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:01.645122 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg" event={"ID":"fd430c6d-ebaf-440a-8f07-b941805b7dae","Type":"ContainerStarted","Data":"0f16807a0ffca87354961962138969e11c271bf334c433e39c9b9a5008e36303"}
Apr 22 21:13:02.650538 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:02.650499 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-pgwps" event={"ID":"9c84221e-24f9-4718-bbaf-6b5e5826c68e","Type":"ContainerStarted","Data":"e7a54de97c76cbba176d20a6b3123874fd7fabf605402603f5c51bdd963844f2"}
Apr 22 21:13:02.651028 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:02.650638 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-pgwps"
Apr 22 21:13:02.667307 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:02.667225 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-pgwps" podStartSLOduration=2.229978641 podStartE2EDuration="4.667204679s" podCreationTimestamp="2026-04-22 21:12:58 +0000 UTC" firstStartedPulling="2026-04-22 21:12:59.668752008 +0000 UTC m=+260.578052700" lastFinishedPulling="2026-04-22 21:13:02.105978042 +0000 UTC m=+263.015278738" observedRunningTime="2026-04-22 21:13:02.664255553 +0000 UTC m=+263.573556265" watchObservedRunningTime="2026-04-22 21:13:02.667204679 +0000 UTC m=+263.576505390"
Apr 22 21:13:05.663440 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:05.663393 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg" event={"ID":"fd430c6d-ebaf-440a-8f07-b941805b7dae","Type":"ContainerStarted","Data":"c9109b21ca12dcad10a0d0de2b40c9f7d2827f3e6e0008b009b1d7b330725493"}
Apr 22 21:13:05.679152 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:05.679108 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-567cb9698d-24qdg" podStartSLOduration=1.430190229 podStartE2EDuration="5.679093228s" podCreationTimestamp="2026-04-22 21:13:00 +0000 UTC" firstStartedPulling="2026-04-22 21:13:01.326611661 +0000 UTC m=+262.235912371" lastFinishedPulling="2026-04-22 21:13:05.575514677 +0000 UTC m=+266.484815370" observedRunningTime="2026-04-22 21:13:05.676672189 +0000 UTC m=+266.585972890" watchObservedRunningTime="2026-04-22 21:13:05.679093228 +0000 UTC m=+266.588393952"
Apr 22 21:13:11.647440 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:11.647405 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-hg6v8"
Apr 22 21:13:16.336624 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.336591 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"]
Apr 22 21:13:16.342092 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.342074 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.345077 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.345053 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 22 21:13:16.345184 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.345078 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 22 21:13:16.345184 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.345107 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-hn5k9\""
Apr 22 21:13:16.352780 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.352753 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"]
Apr 22 21:13:16.483523 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.483489 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4d582bb1-ba87-4fd8-bc56-262e0f840d6c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sch8w\" (UID: \"4d582bb1-ba87-4fd8-bc56-262e0f840d6c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.483523 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.483523 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqn9t\" (UniqueName: \"kubernetes.io/projected/4d582bb1-ba87-4fd8-bc56-262e0f840d6c-kube-api-access-dqn9t\") pod \"servicemesh-operator3-55f49c5f94-sch8w\" (UID: \"4d582bb1-ba87-4fd8-bc56-262e0f840d6c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.584989 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.584953 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4d582bb1-ba87-4fd8-bc56-262e0f840d6c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sch8w\" (UID: \"4d582bb1-ba87-4fd8-bc56-262e0f840d6c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.584989 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.584991 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqn9t\" (UniqueName: \"kubernetes.io/projected/4d582bb1-ba87-4fd8-bc56-262e0f840d6c-kube-api-access-dqn9t\") pod \"servicemesh-operator3-55f49c5f94-sch8w\" (UID: \"4d582bb1-ba87-4fd8-bc56-262e0f840d6c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.588204 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.588092 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/4d582bb1-ba87-4fd8-bc56-262e0f840d6c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-sch8w\" (UID: \"4d582bb1-ba87-4fd8-bc56-262e0f840d6c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.596232 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.596208 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqn9t\" (UniqueName: \"kubernetes.io/projected/4d582bb1-ba87-4fd8-bc56-262e0f840d6c-kube-api-access-dqn9t\") pod \"servicemesh-operator3-55f49c5f94-sch8w\" (UID: \"4d582bb1-ba87-4fd8-bc56-262e0f840d6c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.654747 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.654715 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:16.785041 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:16.785014 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"]
Apr 22 21:13:16.788477 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:13:16.788450 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d582bb1_ba87_4fd8_bc56_262e0f840d6c.slice/crio-366324872b77bddebd961460e05c65461ad44921d34eb69be9b06648461c8ab7 WatchSource:0}: Error finding container 366324872b77bddebd961460e05c65461ad44921d34eb69be9b06648461c8ab7: Status 404 returned error can't find the container with id 366324872b77bddebd961460e05c65461ad44921d34eb69be9b06648461c8ab7
Apr 22 21:13:17.707926 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:17.707884 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w" event={"ID":"4d582bb1-ba87-4fd8-bc56-262e0f840d6c","Type":"ContainerStarted","Data":"366324872b77bddebd961460e05c65461ad44921d34eb69be9b06648461c8ab7"}
Apr 22 21:13:19.716312 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:19.716271 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w" event={"ID":"4d582bb1-ba87-4fd8-bc56-262e0f840d6c","Type":"ContainerStarted","Data":"2c2636f640676383b9c9181b0adacd03bd2e072352ca8aaa4298091c764cb9fc"}
Apr 22 21:13:19.716312 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:19.716320 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:19.733392 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:19.733343 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w" podStartSLOduration=1.376484413 podStartE2EDuration="3.73332483s" podCreationTimestamp="2026-04-22 21:13:16 +0000 UTC" firstStartedPulling="2026-04-22 21:13:16.79104604 +0000 UTC m=+277.700346743" lastFinishedPulling="2026-04-22 21:13:19.147886468 +0000 UTC m=+280.057187160" observedRunningTime="2026-04-22 21:13:19.731819284 +0000 UTC m=+280.641120000" watchObservedRunningTime="2026-04-22 21:13:19.73332483 +0000 UTC m=+280.642625542"
Apr 22 21:13:30.721015 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:30.720985 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-sch8w"
Apr 22 21:13:31.706291 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.706250 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k"]
Apr 22 21:13:31.710005 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.709981 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.712378 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.712352 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 21:13:31.712515 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.712397 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 21:13:31.712660 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.712631 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-rgnm7\"" Apr 22 21:13:31.712813 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.712796 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 21:13:31.712858 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.712834 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 21:13:31.722597 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.722573 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k"] Apr 22 21:13:31.815577 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.815532 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.815799 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.815675 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.815799 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.815712 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/87e8d385-5640-4048-b3c6-0a3f700b3efb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.815799 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.815745 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdrl5\" (UniqueName: \"kubernetes.io/projected/87e8d385-5640-4048-b3c6-0a3f700b3efb-kube-api-access-qdrl5\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.815799 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.815790 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.815976 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.815829 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.815976 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.815855 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.916997 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.916947 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.917174 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.917038 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.917174 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.917060 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/87e8d385-5640-4048-b3c6-0a3f700b3efb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.917174 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.917093 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdrl5\" (UniqueName: \"kubernetes.io/projected/87e8d385-5640-4048-b3c6-0a3f700b3efb-kube-api-access-qdrl5\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.917174 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.917123 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.917174 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.917147 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: 
\"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.917440 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.917356 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.918268 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.918238 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.919942 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.919916 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/87e8d385-5640-4048-b3c6-0a3f700b3efb-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.920268 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.920243 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.920268 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.920260 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.920402 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.920320 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.932877 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.932846 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/87e8d385-5640-4048-b3c6-0a3f700b3efb-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:31.933063 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:31.933042 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdrl5\" (UniqueName: \"kubernetes.io/projected/87e8d385-5640-4048-b3c6-0a3f700b3efb-kube-api-access-qdrl5\") pod \"istiod-openshift-gateway-55ff986f96-qzt9k\" (UID: \"87e8d385-5640-4048-b3c6-0a3f700b3efb\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:32.019736 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:32.019628 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:32.162934 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:32.162898 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k"] Apr 22 21:13:32.166454 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:13:32.166424 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e8d385_5640_4048_b3c6_0a3f700b3efb.slice/crio-abfdd80f18bb4d367fe567ace3af8d396170f27b0ebbe5b099dc1a26a6f2f9b3 WatchSource:0}: Error finding container abfdd80f18bb4d367fe567ace3af8d396170f27b0ebbe5b099dc1a26a6f2f9b3: Status 404 returned error can't find the container with id abfdd80f18bb4d367fe567ace3af8d396170f27b0ebbe5b099dc1a26a6f2f9b3 Apr 22 21:13:32.760486 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:32.760454 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" event={"ID":"87e8d385-5640-4048-b3c6-0a3f700b3efb","Type":"ContainerStarted","Data":"abfdd80f18bb4d367fe567ace3af8d396170f27b0ebbe5b099dc1a26a6f2f9b3"} Apr 22 21:13:33.660770 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:33.660737 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-pgwps" Apr 22 21:13:34.873971 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:34.873933 2546 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 22 21:13:34.874228 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:34.874008 2546 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 22 21:13:35.773408 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:35.773372 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" event={"ID":"87e8d385-5640-4048-b3c6-0a3f700b3efb","Type":"ContainerStarted","Data":"1cfb6acf18d11b7398e0e630c31357edf4af2a51f8a74038645b4071799fc0d8"} Apr 22 21:13:35.773606 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:35.773562 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:35.775356 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:35.775321 2546 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-qzt9k container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 21:13:35.775484 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:35.775395 2546 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" podUID="87e8d385-5640-4048-b3c6-0a3f700b3efb" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 21:13:35.790389 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:35.790332 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" podStartSLOduration=2.084886887 podStartE2EDuration="4.790316817s" podCreationTimestamp="2026-04-22 21:13:31 +0000 UTC" firstStartedPulling="2026-04-22 21:13:32.168229222 +0000 UTC m=+293.077529913" lastFinishedPulling="2026-04-22 21:13:34.873659146 +0000 UTC m=+295.782959843" observedRunningTime="2026-04-22 21:13:35.789719996 +0000 UTC m=+296.699020711" watchObservedRunningTime="2026-04-22 21:13:35.790316817 +0000 UTC m=+296.699617529" Apr 22 21:13:36.777367 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:36.777339 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-qzt9k" Apr 22 21:13:39.626480 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:39.626445 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:13:39.627758 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:39.627735 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:13:39.629315 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:39.629298 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:13:39.630606 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:39.630583 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:13:39.637443 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:13:39.637422 2546 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 21:14:27.154947 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.154915 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hkp26"] Apr 22 21:14:27.158280 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.158258 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-hkp26" Apr 22 21:14:27.160438 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.160411 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 21:14:27.160563 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.160436 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 21:14:27.161291 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.161276 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-dggq7\"" Apr 22 21:14:27.167741 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.167719 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hkp26"] Apr 22 21:14:27.199521 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.199491 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9dg8\" (UniqueName: \"kubernetes.io/projected/9dc7841a-5e92-4e00-b500-3ad617ab57ac-kube-api-access-n9dg8\") pod \"authorino-operator-657f44b778-hkp26\" (UID: \"9dc7841a-5e92-4e00-b500-3ad617ab57ac\") " pod="kuadrant-system/authorino-operator-657f44b778-hkp26" Apr 22 21:14:27.300128 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.300094 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9dg8\" (UniqueName: \"kubernetes.io/projected/9dc7841a-5e92-4e00-b500-3ad617ab57ac-kube-api-access-n9dg8\") pod \"authorino-operator-657f44b778-hkp26\" (UID: \"9dc7841a-5e92-4e00-b500-3ad617ab57ac\") " pod="kuadrant-system/authorino-operator-657f44b778-hkp26" Apr 22 21:14:27.308743 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.308716 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9dg8\" (UniqueName: \"kubernetes.io/projected/9dc7841a-5e92-4e00-b500-3ad617ab57ac-kube-api-access-n9dg8\") pod \"authorino-operator-657f44b778-hkp26\" (UID: \"9dc7841a-5e92-4e00-b500-3ad617ab57ac\") " pod="kuadrant-system/authorino-operator-657f44b778-hkp26" Apr 22 21:14:27.469889 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.469801 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-hkp26" Apr 22 21:14:27.601169 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.600970 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hkp26"] Apr 22 21:14:27.604187 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:14:27.604146 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc7841a_5e92_4e00_b500_3ad617ab57ac.slice/crio-bff37f965448e1dc301413061ac8d6ca7bf4f90f1b5b7dca47fdd02dfa08e044 WatchSource:0}: Error finding container bff37f965448e1dc301413061ac8d6ca7bf4f90f1b5b7dca47fdd02dfa08e044: Status 404 returned error can't find the container with id bff37f965448e1dc301413061ac8d6ca7bf4f90f1b5b7dca47fdd02dfa08e044 Apr 22 21:14:27.606148 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.606134 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:14:27.957909 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:27.957872 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-hkp26" event={"ID":"9dc7841a-5e92-4e00-b500-3ad617ab57ac","Type":"ContainerStarted","Data":"bff37f965448e1dc301413061ac8d6ca7bf4f90f1b5b7dca47fdd02dfa08e044"} Apr 22 21:14:29.967520 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:29.967481 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-hkp26" event={"ID":"9dc7841a-5e92-4e00-b500-3ad617ab57ac","Type":"ContainerStarted","Data":"b22c17aa43c36f16d97d62e46613c08f492e3125d4481c9a3cd0fdc97f446aae"} Apr 22 21:14:29.967918 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:29.967595 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-hkp26" Apr 22 21:14:29.984417 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:29.984360 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-hkp26" podStartSLOduration=1.214916327 podStartE2EDuration="2.984341491s" podCreationTimestamp="2026-04-22 21:14:27 +0000 UTC" firstStartedPulling="2026-04-22 21:14:27.606263703 +0000 UTC m=+348.515564392" lastFinishedPulling="2026-04-22 21:14:29.375688858 +0000 UTC m=+350.284989556" observedRunningTime="2026-04-22 21:14:29.982859825 +0000 UTC m=+350.892160536" watchObservedRunningTime="2026-04-22 21:14:29.984341491 +0000 UTC m=+350.893642203" Apr 22 21:14:32.980089 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:32.980055 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9"] Apr 22 21:14:32.983542 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:32.983525 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:32.985533 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:32.985512 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-zmjvf\"" Apr 22 21:14:32.992313 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:32.992289 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9"] Apr 22 21:14:33.040478 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:33.040441 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46dm\" (UniqueName: \"kubernetes.io/projected/05ff19ed-2062-47dd-8711-827b482ce924-kube-api-access-g46dm\") pod \"limitador-operator-controller-manager-85c4996f8c-c86d9\" (UID: \"05ff19ed-2062-47dd-8711-827b482ce924\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:33.141773 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:33.141732 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g46dm\" (UniqueName: \"kubernetes.io/projected/05ff19ed-2062-47dd-8711-827b482ce924-kube-api-access-g46dm\") pod \"limitador-operator-controller-manager-85c4996f8c-c86d9\" (UID: \"05ff19ed-2062-47dd-8711-827b482ce924\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:33.152133 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:33.152111 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46dm\" (UniqueName: \"kubernetes.io/projected/05ff19ed-2062-47dd-8711-827b482ce924-kube-api-access-g46dm\") pod \"limitador-operator-controller-manager-85c4996f8c-c86d9\" (UID: \"05ff19ed-2062-47dd-8711-827b482ce924\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:33.294692 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:33.294579 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:33.420362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:33.420309 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9"] Apr 22 21:14:33.422996 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:14:33.422963 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05ff19ed_2062_47dd_8711_827b482ce924.slice/crio-b4e05e89fba3d019da285bfb991c7fa9df6fdf0a49452671cbdc47b91121db27 WatchSource:0}: Error finding container b4e05e89fba3d019da285bfb991c7fa9df6fdf0a49452671cbdc47b91121db27: Status 404 returned error can't find the container with id b4e05e89fba3d019da285bfb991c7fa9df6fdf0a49452671cbdc47b91121db27 Apr 22 21:14:33.983309 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:33.983267 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" event={"ID":"05ff19ed-2062-47dd-8711-827b482ce924","Type":"ContainerStarted","Data":"b4e05e89fba3d019da285bfb991c7fa9df6fdf0a49452671cbdc47b91121db27"} Apr 22 21:14:35.992271 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:35.992238 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" event={"ID":"05ff19ed-2062-47dd-8711-827b482ce924","Type":"ContainerStarted","Data":"d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5"} Apr 22 21:14:35.992634 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:35.992393 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:36.007635 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:36.007584 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" podStartSLOduration=2.085080135 podStartE2EDuration="4.007570719s" podCreationTimestamp="2026-04-22 21:14:32 +0000 UTC" firstStartedPulling="2026-04-22 21:14:33.424961316 +0000 UTC m=+354.334262004" lastFinishedPulling="2026-04-22 21:14:35.34745189 +0000 UTC m=+356.256752588" observedRunningTime="2026-04-22 21:14:36.006025811 +0000 UTC m=+356.915326521" watchObservedRunningTime="2026-04-22 21:14:36.007570719 +0000 UTC m=+356.916871429" Apr 22 21:14:40.972889 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:40.972850 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-hkp26" Apr 22 21:14:42.323118 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.323079 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9"] Apr 22 21:14:42.323605 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.323449 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" podUID="05ff19ed-2062-47dd-8711-827b482ce924" containerName="manager" containerID="cri-o://d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5" gracePeriod=2 Apr 22 21:14:42.325224 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.325200 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:42.329127 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.329102 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9"] Apr 22 21:14:42.337194 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.337165 2546 status_manager.go:895] "Failed to get status for pod" podUID="05ff19ed-2062-47dd-8711-827b482ce924" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" err="pods \"limitador-operator-controller-manager-85c4996f8c-c86d9\" is forbidden: User \"system:node:ip-10-0-138-20.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-20.ec2.internal' and this object" Apr 22 21:14:42.561420 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.561395 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:42.563435 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.563407 2546 status_manager.go:895] "Failed to get status for pod" podUID="05ff19ed-2062-47dd-8711-827b482ce924" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" err="pods \"limitador-operator-controller-manager-85c4996f8c-c86d9\" is forbidden: User \"system:node:ip-10-0-138-20.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-20.ec2.internal' and this object" Apr 22 21:14:42.607008 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.606930 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46dm\" (UniqueName: \"kubernetes.io/projected/05ff19ed-2062-47dd-8711-827b482ce924-kube-api-access-g46dm\") pod \"05ff19ed-2062-47dd-8711-827b482ce924\" (UID: \"05ff19ed-2062-47dd-8711-827b482ce924\") " Apr 22 21:14:42.609185 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.609162 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ff19ed-2062-47dd-8711-827b482ce924-kube-api-access-g46dm" (OuterVolumeSpecName: "kube-api-access-g46dm") pod "05ff19ed-2062-47dd-8711-827b482ce924" (UID: "05ff19ed-2062-47dd-8711-827b482ce924"). InnerVolumeSpecName "kube-api-access-g46dm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:14:42.707924 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:42.707674 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g46dm\" (UniqueName: \"kubernetes.io/projected/05ff19ed-2062-47dd-8711-827b482ce924-kube-api-access-g46dm\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:14:43.022216 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.022126 2546 generic.go:358] "Generic (PLEG): container finished" podID="05ff19ed-2062-47dd-8711-827b482ce924" containerID="d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5" exitCode=0 Apr 22 21:14:43.022216 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.022179 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" Apr 22 21:14:43.022391 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.022231 2546 scope.go:117] "RemoveContainer" containerID="d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5" Apr 22 21:14:43.024518 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.024488 2546 status_manager.go:895] "Failed to get status for pod" podUID="05ff19ed-2062-47dd-8711-827b482ce924" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" err="pods \"limitador-operator-controller-manager-85c4996f8c-c86d9\" is forbidden: User \"system:node:ip-10-0-138-20.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-20.ec2.internal' and this object" Apr 22 21:14:43.031116 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.031100 2546 scope.go:117] "RemoveContainer" containerID="d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5" Apr 22 21:14:43.031381 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:14:43.031363 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5\": container with ID starting with d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5 not found: ID does not exist" containerID="d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5" Apr 22 21:14:43.031429 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.031390 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5"} err="failed to get container status \"d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5\": rpc error: code = NotFound desc = could not find container \"d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5\": container with ID starting with d8c2565133fc686dca3742f167f27525ce216838c32874209db8f156c2fa67d5 not found: ID does not exist" Apr 22 21:14:43.032283 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.032261 2546 status_manager.go:895] "Failed to get status for pod" podUID="05ff19ed-2062-47dd-8711-827b482ce924" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" err="pods \"limitador-operator-controller-manager-85c4996f8c-c86d9\" is forbidden: User \"system:node:ip-10-0-138-20.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-20.ec2.internal' and this object" Apr 22 21:14:43.189504 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.189467 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5"] Apr 22 21:14:43.189916 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.189900 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05ff19ed-2062-47dd-8711-827b482ce924" containerName="manager" Apr 22 21:14:43.190007 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.189919 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ff19ed-2062-47dd-8711-827b482ce924" containerName="manager" Apr 22 21:14:43.190007 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.190000 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="05ff19ed-2062-47dd-8711-827b482ce924" 
containerName="manager" Apr 22 21:14:43.194225 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.194203 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.196814 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.196787 2546 status_manager.go:895] "Failed to get status for pod" podUID="05ff19ed-2062-47dd-8711-827b482ce924" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c86d9" err="pods \"limitador-operator-controller-manager-85c4996f8c-c86d9\" is forbidden: User \"system:node:ip-10-0-138-20.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-20.ec2.internal' and this object" Apr 22 21:14:43.197317 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.197298 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-84s6w\"" Apr 22 21:14:43.203943 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.203921 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5"] Apr 22 21:14:43.211547 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.211522 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7kkg\" (UniqueName: \"kubernetes.io/projected/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-kube-api-access-n7kkg\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f5tn5\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.211691 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.211568 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f5tn5\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.313041 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.312954 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7kkg\" (UniqueName: \"kubernetes.io/projected/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-kube-api-access-n7kkg\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f5tn5\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.313041 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.313008 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f5tn5\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.313383 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.313368 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-55c7f4c975-f5tn5\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.320612 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.320583 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7kkg\" (UniqueName: \"kubernetes.io/projected/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-kube-api-access-n7kkg\") pod \"kuadrant-operator-controller-manager-55c7f4c975-f5tn5\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.505381 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.505341 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:43.636954 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.636929 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5"] Apr 22 21:14:43.680167 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:43.680136 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ff19ed-2062-47dd-8711-827b482ce924" path="/var/lib/kubelet/pods/05ff19ed-2062-47dd-8711-827b482ce924/volumes" Apr 22 21:14:44.027445 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:44.027349 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" event={"ID":"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44","Type":"ContainerStarted","Data":"376b363a19a9163cd60ea0e7dbb9c7027efb065d0d26e4d865aea2293c320402"} Apr 22 21:14:48.042598 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:48.042560 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" event={"ID":"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44","Type":"ContainerStarted","Data":"000c295b883552d249d78fea74fd06b8d4ba9f0093879449fdd85d7a5c563b71"} Apr 22 21:14:48.043059 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:48.042623 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:14:48.062041 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:48.061994 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" podStartSLOduration=1.385563064 podStartE2EDuration="5.061979101s" podCreationTimestamp="2026-04-22 21:14:43 +0000 UTC" firstStartedPulling="2026-04-22 21:14:43.643854888 +0000 UTC m=+364.553155579" lastFinishedPulling="2026-04-22 21:14:47.320270925 +0000 UTC m=+368.229571616" observedRunningTime="2026-04-22 21:14:48.059233627 +0000 UTC m=+368.968534343" watchObservedRunningTime="2026-04-22 21:14:48.061979101 +0000 UTC m=+368.971279811" Apr 22 21:14:59.048288 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:14:59.048260 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:15:29.491105 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:29.491067 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7997c4447c-njt5q"] Apr 22 21:15:40.580862 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.580387 2546 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/postgres-868db5846d-j59k5"] Apr 22 21:15:40.585337 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.585319 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:40.587918 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.587893 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 22 21:15:40.588051 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.588030 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-sp6mv\"" Apr 22 21:15:40.594679 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.594627 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-j59k5"] Apr 22 21:15:40.624176 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.624140 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7bf94a99-1118-43f4-b7b1-cf2878a6c165-data\") pod \"postgres-868db5846d-j59k5\" (UID: \"7bf94a99-1118-43f4-b7b1-cf2878a6c165\") " pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:40.624323 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.624184 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6qv\" (UniqueName: \"kubernetes.io/projected/7bf94a99-1118-43f4-b7b1-cf2878a6c165-kube-api-access-nj6qv\") pod \"postgres-868db5846d-j59k5\" (UID: \"7bf94a99-1118-43f4-b7b1-cf2878a6c165\") " pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:40.725628 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.725594 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7bf94a99-1118-43f4-b7b1-cf2878a6c165-data\") pod \"postgres-868db5846d-j59k5\" (UID: \"7bf94a99-1118-43f4-b7b1-cf2878a6c165\") " pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:40.725836 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.725635 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6qv\" (UniqueName: \"kubernetes.io/projected/7bf94a99-1118-43f4-b7b1-cf2878a6c165-kube-api-access-nj6qv\") pod \"postgres-868db5846d-j59k5\" (UID: \"7bf94a99-1118-43f4-b7b1-cf2878a6c165\") " pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:40.726028 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.726007 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7bf94a99-1118-43f4-b7b1-cf2878a6c165-data\") pod \"postgres-868db5846d-j59k5\" (UID: \"7bf94a99-1118-43f4-b7b1-cf2878a6c165\") " pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:40.734046 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.734028 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6qv\" (UniqueName: \"kubernetes.io/projected/7bf94a99-1118-43f4-b7b1-cf2878a6c165-kube-api-access-nj6qv\") pod \"postgres-868db5846d-j59k5\" (UID: \"7bf94a99-1118-43f4-b7b1-cf2878a6c165\") " pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:40.898030 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:40.897948 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:41.022097 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:41.022069 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-j59k5"] Apr 22 21:15:41.024308 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:15:41.024278 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bf94a99_1118_43f4_b7b1_cf2878a6c165.slice/crio-1ef0668e56d0983d22ac12013a5d78ef0dd1647c12203e80eb24c06e6a55c52e WatchSource:0}: Error finding container 1ef0668e56d0983d22ac12013a5d78ef0dd1647c12203e80eb24c06e6a55c52e: Status 404 returned error can't find the container with id 1ef0668e56d0983d22ac12013a5d78ef0dd1647c12203e80eb24c06e6a55c52e Apr 22 21:15:41.237790 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:41.237704 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-j59k5" event={"ID":"7bf94a99-1118-43f4-b7b1-cf2878a6c165","Type":"ContainerStarted","Data":"1ef0668e56d0983d22ac12013a5d78ef0dd1647c12203e80eb24c06e6a55c52e"} Apr 22 21:15:47.267275 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:47.267225 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-j59k5" event={"ID":"7bf94a99-1118-43f4-b7b1-cf2878a6c165","Type":"ContainerStarted","Data":"33dfdc4ec875324b9c1b7c5424db14468732735bcae42752213123f8dbbf6ec6"} Apr 22 21:15:47.267629 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:47.267341 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:47.281882 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:47.281839 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-j59k5" podStartSLOduration=2.066175553 podStartE2EDuration="7.281822878s" podCreationTimestamp="2026-04-22 21:15:40 +0000 UTC" firstStartedPulling="2026-04-22 21:15:41.025906008 +0000 UTC m=+421.935206696" lastFinishedPulling="2026-04-22 21:15:46.241553332 +0000 UTC m=+427.150854021" observedRunningTime="2026-04-22 21:15:47.279596437 +0000 UTC m=+428.188897149" watchObservedRunningTime="2026-04-22 21:15:47.281822878 +0000 UTC m=+428.191123591" Apr 22 21:15:53.300278 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:53.300191 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-j59k5" Apr 22 21:15:54.512955 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.512905 2546 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7997c4447c-njt5q" podUID="d5994f97-83e3-4ca4-a33b-0c478db407cf" containerName="console" containerID="cri-o://236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b" gracePeriod=15 Apr 22 21:15:54.760000 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.759979 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7997c4447c-njt5q_d5994f97-83e3-4ca4-a33b-0c478db407cf/console/0.log" Apr 22 21:15:54.760117 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.760039 2546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7997c4447c-njt5q" Apr 22 21:15:54.854155 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854122 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-config\") pod \"d5994f97-83e3-4ca4-a33b-0c478db407cf\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " Apr 22 21:15:54.854332 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854195 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-service-ca\") pod \"d5994f97-83e3-4ca4-a33b-0c478db407cf\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " Apr 22 21:15:54.854332 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854223 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-oauth-config\") pod \"d5994f97-83e3-4ca4-a33b-0c478db407cf\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " Apr 22 21:15:54.854332 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854248 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-serving-cert\") pod \"d5994f97-83e3-4ca4-a33b-0c478db407cf\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " Apr 22 21:15:54.854332 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854278 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-oauth-serving-cert\") pod \"d5994f97-83e3-4ca4-a33b-0c478db407cf\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " Apr 22 21:15:54.854332 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854293 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzhqf\" (UniqueName: \"kubernetes.io/projected/d5994f97-83e3-4ca4-a33b-0c478db407cf-kube-api-access-wzhqf\") pod \"d5994f97-83e3-4ca4-a33b-0c478db407cf\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " Apr 22 21:15:54.854332 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854331 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-trusted-ca-bundle\") pod \"d5994f97-83e3-4ca4-a33b-0c478db407cf\" (UID: \"d5994f97-83e3-4ca4-a33b-0c478db407cf\") " Apr 22 21:15:54.854797 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854741 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-config" (OuterVolumeSpecName: "console-config") pod "d5994f97-83e3-4ca4-a33b-0c478db407cf" (UID: "d5994f97-83e3-4ca4-a33b-0c478db407cf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:54.854896 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854816 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d5994f97-83e3-4ca4-a33b-0c478db407cf" (UID: "d5994f97-83e3-4ca4-a33b-0c478db407cf"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:54.854896 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854823 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-service-ca" (OuterVolumeSpecName: "service-ca") pod "d5994f97-83e3-4ca4-a33b-0c478db407cf" (UID: "d5994f97-83e3-4ca4-a33b-0c478db407cf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:54.854896 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.854882 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d5994f97-83e3-4ca4-a33b-0c478db407cf" (UID: "d5994f97-83e3-4ca4-a33b-0c478db407cf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:54.856683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.856634 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5994f97-83e3-4ca4-a33b-0c478db407cf-kube-api-access-wzhqf" (OuterVolumeSpecName: "kube-api-access-wzhqf") pod "d5994f97-83e3-4ca4-a33b-0c478db407cf" (UID: "d5994f97-83e3-4ca4-a33b-0c478db407cf"). InnerVolumeSpecName "kube-api-access-wzhqf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:15:54.856683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.856637 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d5994f97-83e3-4ca4-a33b-0c478db407cf" (UID: "d5994f97-83e3-4ca4-a33b-0c478db407cf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:15:54.856801 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.856755 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d5994f97-83e3-4ca4-a33b-0c478db407cf" (UID: "d5994f97-83e3-4ca4-a33b-0c478db407cf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:15:54.955704 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.955631 2546 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-oauth-config\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:15:54.955704 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.955704 2546 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-serving-cert\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:15:54.955907 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.955721 2546 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-oauth-serving-cert\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:15:54.955907 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.955732 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wzhqf\" (UniqueName: \"kubernetes.io/projected/d5994f97-83e3-4ca4-a33b-0c478db407cf-kube-api-access-wzhqf\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:15:54.955907 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.955746 2546 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-trusted-ca-bundle\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:15:54.955907 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.955759 2546 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-console-config\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:15:54.955907 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:54.955768 2546 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5994f97-83e3-4ca4-a33b-0c478db407cf-service-ca\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:15:55.296377 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.296309 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7997c4447c-njt5q_d5994f97-83e3-4ca4-a33b-0c478db407cf/console/0.log" Apr 22 21:15:55.296377 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.296347 2546 generic.go:358] "Generic (PLEG): container finished" podID="d5994f97-83e3-4ca4-a33b-0c478db407cf" containerID="236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b" exitCode=2 Apr 22 21:15:55.296555 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.296383 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7997c4447c-njt5q" event={"ID":"d5994f97-83e3-4ca4-a33b-0c478db407cf","Type":"ContainerDied","Data":"236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b"} Apr 22 21:15:55.296555 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.296421 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7997c4447c-njt5q" event={"ID":"d5994f97-83e3-4ca4-a33b-0c478db407cf","Type":"ContainerDied","Data":"8111a72237871f5f805cae20063e55929dc6b5aa0de592bf173312c099d7a027"} Apr 22 21:15:55.296555 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.296425 2546 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7997c4447c-njt5q" Apr 22 21:15:55.296555 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.296441 2546 scope.go:117] "RemoveContainer" containerID="236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b" Apr 22 21:15:55.308197 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.307588 2546 scope.go:117] "RemoveContainer" containerID="236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b" Apr 22 21:15:55.308197 ip-10-0-138-20 kubenswrapper[2546]: E0422 21:15:55.308123 2546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b\": container with ID starting with 236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b not found: ID does not exist" containerID="236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b" Apr 22 21:15:55.308197 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.308162 2546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b"} err="failed to get container status \"236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b\": rpc error: code = NotFound desc = could not find container \"236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b\": container with ID starting with 236acee0d1c0f718b313e736724aa79ccaafb998fa128024110eba453a83a13b not found: ID does not exist" Apr 22 21:15:55.319535 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.319511 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7997c4447c-njt5q"] Apr 22 21:15:55.323461 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.323436 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7997c4447c-njt5q"] Apr 22 21:15:55.681593 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:15:55.681558 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5994f97-83e3-4ca4-a33b-0c478db407cf" path="/var/lib/kubelet/pods/d5994f97-83e3-4ca4-a33b-0c478db407cf/volumes" Apr 22 21:16:44.509313 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.509274 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6fd7f46f76-qvkjz"] Apr 22 21:16:44.509778 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.509617 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5994f97-83e3-4ca4-a33b-0c478db407cf" containerName="console" Apr 22 21:16:44.509778 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.509629 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5994f97-83e3-4ca4-a33b-0c478db407cf" containerName="console" Apr 22 21:16:44.509778 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.509710 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5994f97-83e3-4ca4-a33b-0c478db407cf" containerName="console" Apr 22 21:16:44.513988 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.513972 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.516828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.516807 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 21:16:44.516828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.516819 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 21:16:44.517060 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.516810 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qxg7n\"" Apr 22 21:16:44.522273 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.522253 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6fd7f46f76-qvkjz"] Apr 22 21:16:44.578934 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.578900 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv5bf\" (UniqueName: \"kubernetes.io/projected/f666a878-e278-4a3f-97b7-ac6be5bad19d-kube-api-access-nv5bf\") pod \"maas-api-6fd7f46f76-qvkjz\" (UID: \"f666a878-e278-4a3f-97b7-ac6be5bad19d\") " pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.579094 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.578961 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f666a878-e278-4a3f-97b7-ac6be5bad19d-maas-api-tls\") pod \"maas-api-6fd7f46f76-qvkjz\" (UID: \"f666a878-e278-4a3f-97b7-ac6be5bad19d\") " pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.679847 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.679813 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f666a878-e278-4a3f-97b7-ac6be5bad19d-maas-api-tls\") pod \"maas-api-6fd7f46f76-qvkjz\" (UID: \"f666a878-e278-4a3f-97b7-ac6be5bad19d\") " pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.680014 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.679867 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv5bf\" (UniqueName: \"kubernetes.io/projected/f666a878-e278-4a3f-97b7-ac6be5bad19d-kube-api-access-nv5bf\") pod \"maas-api-6fd7f46f76-qvkjz\" (UID: \"f666a878-e278-4a3f-97b7-ac6be5bad19d\") " pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.682374 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.682353 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f666a878-e278-4a3f-97b7-ac6be5bad19d-maas-api-tls\") pod \"maas-api-6fd7f46f76-qvkjz\" (UID: \"f666a878-e278-4a3f-97b7-ac6be5bad19d\") " pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.687147 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.687128 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv5bf\" (UniqueName: \"kubernetes.io/projected/f666a878-e278-4a3f-97b7-ac6be5bad19d-kube-api-access-nv5bf\") pod \"maas-api-6fd7f46f76-qvkjz\" (UID: \"f666a878-e278-4a3f-97b7-ac6be5bad19d\") " pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.824484 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.824457 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:44.950498 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:44.950475 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6fd7f46f76-qvkjz"] Apr 22 21:16:44.952774 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:16:44.952747 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf666a878_e278_4a3f_97b7_ac6be5bad19d.slice/crio-c244cc93f895a8c9f7f6a2e7db1df0933caf54492143c820acb1c3b8cd2dbccf WatchSource:0}: Error finding container c244cc93f895a8c9f7f6a2e7db1df0933caf54492143c820acb1c3b8cd2dbccf: Status 404 returned error can't find the container with id c244cc93f895a8c9f7f6a2e7db1df0933caf54492143c820acb1c3b8cd2dbccf Apr 22 21:16:45.472309 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:45.472274 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6fd7f46f76-qvkjz" event={"ID":"f666a878-e278-4a3f-97b7-ac6be5bad19d","Type":"ContainerStarted","Data":"c244cc93f895a8c9f7f6a2e7db1df0933caf54492143c820acb1c3b8cd2dbccf"} Apr 22 21:16:47.483243 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:47.483211 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6fd7f46f76-qvkjz" event={"ID":"f666a878-e278-4a3f-97b7-ac6be5bad19d","Type":"ContainerStarted","Data":"56d9dea38c1e2a4623b4210865c9fd063a9279fc64153d4953185d18e8830eab"} Apr 22 21:16:47.483695 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:47.483379 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:47.498200 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:47.498144 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6fd7f46f76-qvkjz" podStartSLOduration=1.157941549 podStartE2EDuration="3.498128406s" podCreationTimestamp="2026-04-22 21:16:44 +0000 UTC" firstStartedPulling="2026-04-22 21:16:44.953920822 +0000 UTC m=+485.863221511" lastFinishedPulling="2026-04-22 21:16:47.294107666 +0000 UTC m=+488.203408368" observedRunningTime="2026-04-22 21:16:47.497007313 +0000 UTC m=+488.406308025" watchObservedRunningTime="2026-04-22 21:16:47.498128406 +0000 UTC m=+488.407429146" Apr 22 21:16:53.495564 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:53.495510 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6fd7f46f76-qvkjz" Apr 22 21:16:55.930192 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:55.930155 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl"] Apr 22 21:16:55.933582 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:55.933562 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:55.935875 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:55.935853 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 21:16:55.935997 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:55.935896 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 22 21:16:55.936809 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:55.936787 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 21:16:55.936897 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:55.936799 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-bhkfk\"" Apr 22 21:16:55.941386 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:55.941365 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl"] Apr 22 21:16:56.080827 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.080797 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.080994 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.080864 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.080994 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.080893 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.080994 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.080910 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8097950-c609-4ba8-86d5-eb2b7bd83d95-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.080994 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.080930 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.080994 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.080964 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qsf\" (UniqueName: 
\"kubernetes.io/projected/a8097950-c609-4ba8-86d5-eb2b7bd83d95-kube-api-access-l8qsf\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181449 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181372 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qsf\" (UniqueName: \"kubernetes.io/projected/a8097950-c609-4ba8-86d5-eb2b7bd83d95-kube-api-access-l8qsf\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181449 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181434 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181698 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181463 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181698 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181485 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181698 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181501 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8097950-c609-4ba8-86d5-eb2b7bd83d95-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181698 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181590 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181892 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181866 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.181972 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.181925 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-home\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.182064 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.182046 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.183994 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.183972 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8097950-c609-4ba8-86d5-eb2b7bd83d95-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.184249 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.184231 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8097950-c609-4ba8-86d5-eb2b7bd83d95-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.188988 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.188957 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qsf\" (UniqueName: \"kubernetes.io/projected/a8097950-c609-4ba8-86d5-eb2b7bd83d95-kube-api-access-l8qsf\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl\" (UID: \"a8097950-c609-4ba8-86d5-eb2b7bd83d95\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.244473 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.244440 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:16:56.384948 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.384916 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl"] Apr 22 21:16:56.387040 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:16:56.387014 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8097950_c609_4ba8_86d5_eb2b7bd83d95.slice/crio-6e9b4fa4fb424d400d2b6d36d229a8b7811bed856a92ea36b8c09dd00fb7a500 WatchSource:0}: Error finding container 6e9b4fa4fb424d400d2b6d36d229a8b7811bed856a92ea36b8c09dd00fb7a500: Status 404 returned error can't find the container with id 6e9b4fa4fb424d400d2b6d36d229a8b7811bed856a92ea36b8c09dd00fb7a500 Apr 22 21:16:56.516974 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:16:56.516891 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" event={"ID":"a8097950-c609-4ba8-86d5-eb2b7bd83d95","Type":"ContainerStarted","Data":"6e9b4fa4fb424d400d2b6d36d229a8b7811bed856a92ea36b8c09dd00fb7a500"} Apr 22 21:17:02.541387 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:02.541349 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" event={"ID":"a8097950-c609-4ba8-86d5-eb2b7bd83d95","Type":"ContainerStarted","Data":"bed44cc88093802c8d614f8939833a40e282a5ef46e0b003dead2a1cdb9906b6"} Apr 22 21:17:07.565834 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:07.565743 2546 generic.go:358] "Generic (PLEG): container finished" podID="a8097950-c609-4ba8-86d5-eb2b7bd83d95" containerID="bed44cc88093802c8d614f8939833a40e282a5ef46e0b003dead2a1cdb9906b6" exitCode=0 Apr 22 21:17:07.566220 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:07.565821 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" event={"ID":"a8097950-c609-4ba8-86d5-eb2b7bd83d95","Type":"ContainerDied","Data":"bed44cc88093802c8d614f8939833a40e282a5ef46e0b003dead2a1cdb9906b6"} Apr 22 21:17:09.574258 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:09.574223 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" event={"ID":"a8097950-c609-4ba8-86d5-eb2b7bd83d95","Type":"ContainerStarted","Data":"3f6966cb39009a156194c1d34b4483718733de420f4cd6a2ad3b0be0ba5ef954"} Apr 22 21:17:09.574683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:09.574456 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:17:09.592549 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:09.592494 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" podStartSLOduration=2.281478043 podStartE2EDuration="14.592475493s" podCreationTimestamp="2026-04-22 21:16:55 +0000 UTC" firstStartedPulling="2026-04-22 21:16:56.38901988 +0000 UTC m=+497.298320570" lastFinishedPulling="2026-04-22 21:17:08.700017328 +0000 UTC m=+509.609318020" observedRunningTime="2026-04-22 21:17:09.590510976 +0000 UTC m=+510.499811690" watchObservedRunningTime="2026-04-22 21:17:09.592475493 +0000 UTC m=+510.501776205" Apr 22 21:17:20.595327 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:20.595232 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl" Apr 22 21:17:46.923104 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:46.923065 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d"] Apr 22 21:17:46.956920 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:46.956880 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d"] Apr 22 21:17:46.957091 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:46.957038 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:46.959719 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:46.959695 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 22 21:17:47.044493 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.044462 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.044683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.044528 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.044683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.044569 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.044683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.044592 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tff\" (UniqueName: \"kubernetes.io/projected/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-kube-api-access-79tff\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.044683 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.044671 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.044844 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.044715 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: 
\"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.145739 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.145703 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.145739 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.145746 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.145938 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.145798 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.145938 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.145819 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.145938 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.145841 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79tff\" (UniqueName: \"kubernetes.io/projected/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-kube-api-access-79tff\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.145938 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.145871 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.146238 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.146216 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.146319 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.146302 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.146589 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.146563 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.148418 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.148393 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.148511 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.148425 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.153785 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.153767 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tff\" (UniqueName: \"kubernetes.io/projected/ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2-kube-api-access-79tff\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d\" (UID: \"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.268177 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.268092 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:47.398079 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.398024 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d"] Apr 22 21:17:47.401168 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:17:47.401140 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba44c18a_d496_4c3b_8eb5_fbb02ece9cf2.slice/crio-e187f3ff13f048a881ca3e3871858c3ecb090d68c5610e24669d3b95bee6c96e WatchSource:0}: Error finding container e187f3ff13f048a881ca3e3871858c3ecb090d68c5610e24669d3b95bee6c96e: Status 404 returned error can't find the container with id e187f3ff13f048a881ca3e3871858c3ecb090d68c5610e24669d3b95bee6c96e Apr 22 21:17:47.708901 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.708861 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" event={"ID":"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2","Type":"ContainerStarted","Data":"a6ec0fab8ff5504ef3198424b5dad1d9c87bc824b6c8f2418cf6b0cacfe21635"} Apr 22 21:17:47.708901 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:47.708904 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" event={"ID":"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2","Type":"ContainerStarted","Data":"e187f3ff13f048a881ca3e3871858c3ecb090d68c5610e24669d3b95bee6c96e"} Apr 22 21:17:53.731245 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:53.731208 2546 generic.go:358] "Generic (PLEG): container finished" podID="ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2" containerID="a6ec0fab8ff5504ef3198424b5dad1d9c87bc824b6c8f2418cf6b0cacfe21635" exitCode=0 Apr 22 21:17:53.731697 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:53.731292 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" event={"ID":"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2","Type":"ContainerDied","Data":"a6ec0fab8ff5504ef3198424b5dad1d9c87bc824b6c8f2418cf6b0cacfe21635"} Apr 22 21:17:54.738072 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:54.738035 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" event={"ID":"ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2","Type":"ContainerStarted","Data":"f5d2f73c51c39cc43029be241b2cd299b825542d5993b62899e82c7d766cbc91"} Apr 22 21:17:54.738469 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:54.738277 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:17:54.752558 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:17:54.752508 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" podStartSLOduration=8.570942796 podStartE2EDuration="8.752496529s" podCreationTimestamp="2026-04-22 21:17:46 +0000 UTC" firstStartedPulling="2026-04-22 21:17:53.731913659 +0000 UTC m=+554.641214347" lastFinishedPulling="2026-04-22 21:17:53.913467391 +0000 UTC m=+554.822768080" observedRunningTime="2026-04-22 21:17:54.751742343 +0000 UTC m=+555.661043055" watchObservedRunningTime="2026-04-22 21:17:54.752496529 +0000 UTC m=+555.661797239" Apr 22 21:18:05.755404 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:18:05.755368 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d" Apr 22 21:18:39.654053 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:18:39.654025 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:18:39.655139 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:18:39.655113 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:18:39.656557 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:18:39.656537 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:18:39.657440 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:18:39.657418 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:23:39.680005 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:23:39.679977 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:23:39.682286 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:23:39.682263 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:23:39.682475 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:23:39.682457 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:23:39.684507 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:23:39.684480 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:28:39.711157 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:28:39.711129 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:28:39.713410 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:28:39.713389 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:28:39.713540 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:28:39.713442 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:28:39.715912 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:28:39.715893 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:29:55.180222 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.180187 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5"] Apr 22 21:29:55.180757 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.180423 2546 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" podUID="6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" containerName="manager" containerID="cri-o://000c295b883552d249d78fea74fd06b8d4ba9f0093879449fdd85d7a5c563b71" gracePeriod=10 Apr 22 21:29:55.388886 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.388851 2546 generic.go:358] "Generic (PLEG): container finished" podID="6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" containerID="000c295b883552d249d78fea74fd06b8d4ba9f0093879449fdd85d7a5c563b71" exitCode=0 Apr 22 21:29:55.389073 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.388935 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" event={"ID":"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44","Type":"ContainerDied","Data":"000c295b883552d249d78fea74fd06b8d4ba9f0093879449fdd85d7a5c563b71"} Apr 22 21:29:55.434141 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.434086 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:29:55.476183 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.476148 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7kkg\" (UniqueName: \"kubernetes.io/projected/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-kube-api-access-n7kkg\") pod \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " Apr 22 21:29:55.476365 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.476266 2546 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-extensions-socket-volume\") pod \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\" (UID: \"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44\") " Apr 22 21:29:55.476685 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.476625 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" (UID: "6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:29:55.478705 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.478632 2546 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-kube-api-access-n7kkg" (OuterVolumeSpecName: "kube-api-access-n7kkg") pod "6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" (UID: "6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44"). InnerVolumeSpecName "kube-api-access-n7kkg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:29:55.577749 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.577719 2546 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-extensions-socket-volume\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:29:55.577749 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:55.577746 2546 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7kkg\" (UniqueName: \"kubernetes.io/projected/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44-kube-api-access-n7kkg\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\"" Apr 22 21:29:56.393887 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:56.393848 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" event={"ID":"6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44","Type":"ContainerDied","Data":"376b363a19a9163cd60ea0e7dbb9c7027efb065d0d26e4d865aea2293c320402"} Apr 22 21:29:56.394309 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:56.393896 2546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5" Apr 22 21:29:56.394309 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:56.393901 2546 scope.go:117] "RemoveContainer" containerID="000c295b883552d249d78fea74fd06b8d4ba9f0093879449fdd85d7a5c563b71" Apr 22 21:29:56.410180 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:56.410154 2546 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5"] Apr 22 21:29:56.413180 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:56.413144 2546 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-f5tn5"] Apr 22 21:29:57.681411 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:29:57.681364 2546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" path="/var/lib/kubelet/pods/6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44/volumes" Apr 22 21:31:11.267541 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.267508 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt"] Apr 22 21:31:11.268120 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.267932 2546 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" containerName="manager" Apr 22 21:31:11.268120 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.267944 2546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" containerName="manager" Apr 22 21:31:11.268120 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.268008 2546 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c6ce28d-cdd5-4bd1-8a13-d9ef9f9e8c44" containerName="manager" Apr 22 21:31:11.270857 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.270841 2546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.273261 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.273243 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-84s6w\"" Apr 22 21:31:11.281550 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.281528 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt"] Apr 22 21:31:11.341084 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.341053 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ed1448d-5a26-438d-918c-dcf0ba048a57-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-fzbtt\" (UID: \"0ed1448d-5a26-438d-918c-dcf0ba048a57\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.341251 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.341107 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psckt\" (UniqueName: \"kubernetes.io/projected/0ed1448d-5a26-438d-918c-dcf0ba048a57-kube-api-access-psckt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-fzbtt\" (UID: \"0ed1448d-5a26-438d-918c-dcf0ba048a57\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.442362 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.442321 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ed1448d-5a26-438d-918c-dcf0ba048a57-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-fzbtt\" (UID: \"0ed1448d-5a26-438d-918c-dcf0ba048a57\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.442536 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.442385 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psckt\" (UniqueName: \"kubernetes.io/projected/0ed1448d-5a26-438d-918c-dcf0ba048a57-kube-api-access-psckt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-fzbtt\" (UID: \"0ed1448d-5a26-438d-918c-dcf0ba048a57\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.442742 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.442723 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0ed1448d-5a26-438d-918c-dcf0ba048a57-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-fzbtt\" (UID: \"0ed1448d-5a26-438d-918c-dcf0ba048a57\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.455634 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.455610 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psckt\" (UniqueName: \"kubernetes.io/projected/0ed1448d-5a26-438d-918c-dcf0ba048a57-kube-api-access-psckt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-fzbtt\" (UID: \"0ed1448d-5a26-438d-918c-dcf0ba048a57\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.582533 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.582504 2546 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:11.725150 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.725111 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt"] Apr 22 21:31:11.727950 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:31:11.727925 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed1448d_5a26_438d_918c_dcf0ba048a57.slice/crio-69cb12570d728bc67f59bac5797bfbc8e69d803ab853a4d860d657a12bb65805 WatchSource:0}: Error finding container 69cb12570d728bc67f59bac5797bfbc8e69d803ab853a4d860d657a12bb65805: Status 404 returned error can't find the container with id 69cb12570d728bc67f59bac5797bfbc8e69d803ab853a4d860d657a12bb65805 Apr 22 21:31:11.730277 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:11.730261 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:31:12.676917 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:12.676880 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" event={"ID":"0ed1448d-5a26-438d-918c-dcf0ba048a57","Type":"ContainerStarted","Data":"c63f45e7084053f323acd7c36e96c87d43c9266dcf33347babfc131c1ecb239a"} Apr 22 21:31:12.676917 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:12.676917 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" event={"ID":"0ed1448d-5a26-438d-918c-dcf0ba048a57","Type":"ContainerStarted","Data":"69cb12570d728bc67f59bac5797bfbc8e69d803ab853a4d860d657a12bb65805"} Apr 22 21:31:12.677407 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:12.677099 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:31:12.696214 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:12.696154 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" podStartSLOduration=1.696138752 podStartE2EDuration="1.696138752s" podCreationTimestamp="2026-04-22 21:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:31:12.693956123 +0000 UTC m=+1353.603256835" watchObservedRunningTime="2026-04-22 21:31:12.696138752 +0000 UTC m=+1353.605439462" Apr 22 21:31:23.682488 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:31:23.682455 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-fzbtt" Apr 22 21:33:39.736033 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:33:39.735994 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:33:39.738296 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:33:39.738275 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:33:39.738905 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:33:39.738884 2546 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:33:39.741147 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:33:39.741129 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:38:39.760267 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:38:39.760234 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:38:39.762671 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:38:39.762636 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:38:39.763911 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:38:39.763890 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:38:39.766307 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:38:39.766291 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:40:56.344007 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:56.343969 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-pgwps_9c84221e-24f9-4718-bbaf-6b5e5826c68e/manager/0.log" Apr 22 21:40:56.466556 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:56.466524 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6fd7f46f76-qvkjz_f666a878-e278-4a3f-97b7-ac6be5bad19d/maas-api/0.log" Apr 22 21:40:56.718020 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:56.717985 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hg6v8_1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31/manager/0.log" Apr 22 21:40:56.941828 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:56.941794 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-754bfc4657-nfnf9_5872c721-2d2b-4d40-908f-b212dbaa3dcf/manager/0.log" Apr 22 21:40:57.165340 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:57.165304 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-j59k5_7bf94a99-1118-43f4-b7b1-cf2878a6c165/postgres/0.log" Apr 22 21:40:58.510597 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:58.510566 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-hkp26_9dc7841a-5e92-4e00-b500-3ad617ab57ac/manager/0.log" Apr 22 21:40:58.968385 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:58.968349 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-fzbtt_0ed1448d-5a26-438d-918c-dcf0ba048a57/manager/0.log" Apr 22 21:40:59.651385 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:59.651351 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qzt9k_87e8d385-5640-4048-b3c6-0a3f700b3efb/discovery/0.log" Apr 22 21:40:59.758713 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:40:59.758684 2546 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-567cb9698d-24qdg_fd430c6d-ebaf-440a-8f07-b941805b7dae/kube-auth-proxy/0.log" Apr 22 21:41:00.087950 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:00.087923 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-569c57978d-cwfqs_0bf09f77-1234-462f-bd11-01faf61546b9/router/0.log" Apr 22 21:41:00.546802 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:00.546717 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl_a8097950-c609-4ba8-86d5-eb2b7bd83d95/storage-initializer/0.log" Apr 22 21:41:00.554236 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:00.554198 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-pggkl_a8097950-c609-4ba8-86d5-eb2b7bd83d95/main/0.log" Apr 22 21:41:00.896732 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:00.896687 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d_ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2/main/0.log" Apr 22 21:41:00.903197 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:00.903171 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-vbz2d_ba44c18a-d496-4c3b-8eb5-fbb02ece9cf2/storage-initializer/0.log" Apr 22 21:41:07.953281 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:07.953250 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-flkgv_e49b46d0-3a94-438f-bdc2-b93ad7f84cde/global-pull-secret-syncer/0.log" Apr 22 21:41:08.191375 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:08.191344 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l25nv_a782077c-00c8-4baa-8f9c-ce021c1c5d9c/konnectivity-agent/0.log" Apr 22 21:41:08.277446 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:08.277372 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-20.ec2.internal_2219300bfb8a5cea9f09d55fabfc69ab/haproxy/0.log" Apr 22 21:41:12.483954 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:12.483917 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-hkp26_9dc7841a-5e92-4e00-b500-3ad617ab57ac/manager/0.log" Apr 22 21:41:12.678499 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:12.678451 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-fzbtt_0ed1448d-5a26-438d-918c-dcf0ba048a57/manager/0.log" Apr 22 21:41:14.463773 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:14.463681 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-l798c_318e56a6-6fae-4d59-84d7-4a832ad3b200/cluster-monitoring-operator/0.log" Apr 22 21:41:14.687136 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:14.687105 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9xnmz_fa0a2fd2-a715-4e30-823a-cabc466c9686/node-exporter/0.log" Apr 22 21:41:14.705963 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:14.705937 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9xnmz_fa0a2fd2-a715-4e30-823a-cabc466c9686/kube-rbac-proxy/0.log" Apr 22 21:41:14.733847 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:14.733779 2546 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9xnmz_fa0a2fd2-a715-4e30-823a-cabc466c9686/init-textfile/0.log" Apr 22 21:41:15.073955 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:15.073927 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2kkrv_09ad8720-76bf-492a-b6dc-52abf1046e57/prometheus-operator/0.log" Apr 22 21:41:15.091462 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:15.091432 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2kkrv_09ad8720-76bf-492a-b6dc-52abf1046e57/kube-rbac-proxy/0.log" Apr 22 21:41:16.321068 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.321030 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-lt772_fb72504e-c4a7-4b19-a04d-f51d0fd91812/networking-console-plugin/0.log" Apr 22 21:41:16.530500 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.530464 2546 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv"] Apr 22 21:41:16.534516 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.534492 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.536675 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.536624 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nnb7p\"/\"openshift-service-ca.crt\"" Apr 22 21:41:16.537241 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.537221 2546 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nnb7p\"/\"default-dockercfg-2cdrj\"" Apr 22 21:41:16.537333 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.537262 2546 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nnb7p\"/\"kube-root-ca.crt\"" Apr 22 21:41:16.543260 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.543231 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv"] Apr 22 21:41:16.599187 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.599149 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-lib-modules\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.599383 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.599203 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-proc\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.599383 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.599235 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-podres\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" 
Apr 22 21:41:16.599383 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.599324 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6rv\" (UniqueName: \"kubernetes.io/projected/6317f293-c9cc-4671-961b-3b8b11553081-kube-api-access-5q6rv\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.599383 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.599366 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-sys\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700102 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700063 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-podres\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700322 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700131 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6rv\" (UniqueName: \"kubernetes.io/projected/6317f293-c9cc-4671-961b-3b8b11553081-kube-api-access-5q6rv\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700322 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700178 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-sys\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700322 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700246 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-lib-modules\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700322 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700253 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-podres\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700322 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700292 2546 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-proc\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700514 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700347 2546 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-sys\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700514 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700352 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-proc\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.700514 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.700440 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6317f293-c9cc-4671-961b-3b8b11553081-lib-modules\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.707852 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.707825 2546 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6rv\" (UniqueName: \"kubernetes.io/projected/6317f293-c9cc-4671-961b-3b8b11553081-kube-api-access-5q6rv\") pod \"perf-node-gather-daemonset-sjsvv\" (UID: \"6317f293-c9cc-4671-961b-3b8b11553081\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.846530 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.846481 2546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:16.852877 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.852803 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/1.log" Apr 22 21:41:16.862670 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.862618 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2rdf2_7e715539-0e5b-418b-a686-dfcf837def03/console-operator/2.log" Apr 22 21:41:16.974750 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.974722 2546 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv"] Apr 22 21:41:16.977307 ip-10-0-138-20 kubenswrapper[2546]: W0422 21:41:16.977279 2546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6317f293_c9cc_4671_961b_3b8b11553081.slice/crio-01159ae1440a3ea4b52031bad46764277ad316825b5cd785d79f9be2ee5ef4a4 WatchSource:0}: Error finding container 01159ae1440a3ea4b52031bad46764277ad316825b5cd785d79f9be2ee5ef4a4: Status 404 returned error can't find the container with id 01159ae1440a3ea4b52031bad46764277ad316825b5cd785d79f9be2ee5ef4a4 Apr 22 21:41:16.978946 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:16.978931 2546 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:41:17.846180 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:17.846140 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" event={"ID":"6317f293-c9cc-4671-961b-3b8b11553081","Type":"ContainerStarted","Data":"64b1586dd0848e87083543d5eec61f2b956409c6132940d26bb3a2386a872940"} Apr 
22 21:41:17.846180 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:17.846185 2546 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" event={"ID":"6317f293-c9cc-4671-961b-3b8b11553081","Type":"ContainerStarted","Data":"01159ae1440a3ea4b52031bad46764277ad316825b5cd785d79f9be2ee5ef4a4"} Apr 22 21:41:17.846605 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:17.846210 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:17.848449 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:17.848429 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-98ckw_35136bae-7913-4fe1-9aa1-51925e06b1c6/volume-data-source-validator/0.log" Apr 22 21:41:17.860089 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:17.859795 2546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" podStartSLOduration=1.859778011 podStartE2EDuration="1.859778011s" podCreationTimestamp="2026-04-22 21:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:41:17.858898513 +0000 UTC m=+1958.768199222" watchObservedRunningTime="2026-04-22 21:41:17.859778011 +0000 UTC m=+1958.769078722" Apr 22 21:41:18.600047 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:18.600017 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-264s7_3812e9ff-4663-427d-8f9a-a20d66323320/dns/0.log" Apr 22 21:41:18.620143 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:18.620107 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-264s7_3812e9ff-4663-427d-8f9a-a20d66323320/kube-rbac-proxy/0.log" Apr 22 21:41:18.768997 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:18.768959 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v6vgf_fbb54413-9542-4220-90ec-df660b4891fa/dns-node-resolver/0.log" Apr 22 21:41:19.271843 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:19.271812 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zbnft_ee933643-0139-4bfd-9d0d-903d3404df30/node-ca/0.log" Apr 22 21:41:20.175524 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:20.175446 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-qzt9k_87e8d385-5640-4048-b3c6-0a3f700b3efb/discovery/0.log" Apr 22 21:41:20.193406 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:20.193376 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-567cb9698d-24qdg_fd430c6d-ebaf-440a-8f07-b941805b7dae/kube-auth-proxy/0.log" Apr 22 21:41:20.289921 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:20.289890 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-569c57978d-cwfqs_0bf09f77-1234-462f-bd11-01faf61546b9/router/0.log" Apr 22 21:41:20.764954 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:20.764923 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xbm9h_dd46fc2d-0bfc-4e1b-b429-0d14ff029faf/serve-healthcheck-canary/0.log" Apr 22 21:41:21.214617 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:21.214586 2546 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-44nf9_dd3b0638-5a78-4ca8-8274-4653bf8b98d0/insights-operator/0.log" Apr 22 21:41:21.215272 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:21.215247 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-44nf9_dd3b0638-5a78-4ca8-8274-4653bf8b98d0/insights-operator/1.log" Apr 22 21:41:21.359708 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:21.359678 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4bdt_b818737b-3647-434e-b75c-fa264c7a4dd2/kube-rbac-proxy/0.log" Apr 22 21:41:21.378507 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:21.378478 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4bdt_b818737b-3647-434e-b75c-fa264c7a4dd2/exporter/0.log" Apr 22 21:41:21.398842 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:21.398807 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q4bdt_b818737b-3647-434e-b75c-fa264c7a4dd2/extractor/0.log" Apr 22 21:41:23.237133 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:23.237102 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-pgwps_9c84221e-24f9-4718-bbaf-6b5e5826c68e/manager/0.log" Apr 22 21:41:23.284369 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:23.284334 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6fd7f46f76-qvkjz_f666a878-e278-4a3f-97b7-ac6be5bad19d/maas-api/0.log" Apr 22 21:41:23.373161 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:23.373109 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hg6v8_1ccc3e41-3f5b-442d-b8fd-e1f72bb33f31/manager/0.log" Apr 22 21:41:23.424879 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:23.424845 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-754bfc4657-nfnf9_5872c721-2d2b-4d40-908f-b212dbaa3dcf/manager/0.log" Apr 22 21:41:23.482330 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:23.482301 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-j59k5_7bf94a99-1118-43f4-b7b1-cf2878a6c165/postgres/0.log" Apr 22 21:41:23.861149 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:23.861119 2546 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-sjsvv" Apr 22 21:41:29.466698 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:29.466665 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-567gg_2d34322a-f3f9-4e65-8714-cec36115f27e/kube-storage-version-migrator-operator/1.log" Apr 22 21:41:29.468306 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:29.468283 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-567gg_2d34322a-f3f9-4e65-8714-cec36115f27e/kube-storage-version-migrator-operator/0.log" Apr 22 21:41:30.822463 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.822432 2546 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5b9n_0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63/kube-multus-additional-cni-plugins/0.log" Apr 22 21:41:30.841855 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.841826 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5b9n_0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63/egress-router-binary-copy/0.log" Apr 22 21:41:30.862129 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.862100 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5b9n_0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63/cni-plugins/0.log" Apr 22 21:41:30.880492 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.880472 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5b9n_0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63/bond-cni-plugin/0.log" Apr 22 21:41:30.900736 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.900707 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5b9n_0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63/routeoverride-cni/0.log" Apr 22 21:41:30.919067 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.919043 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5b9n_0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63/whereabouts-cni-bincopy/0.log" Apr 22 21:41:30.941620 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.941596 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v5b9n_0a4c75c0-b2f0-4d74-99f6-47fc56e8fa63/whereabouts-cni/0.log" Apr 22 21:41:30.997435 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:30.997409 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tdtws_aab52e3e-8c7f-4bd0-92a2-9f6136daf78d/kube-multus/0.log" Apr 22 21:41:31.051194 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:31.051158 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4476p_046a767f-6fe0-4c57-ba82-6a2ce75f7864/network-metrics-daemon/0.log" Apr 22 21:41:31.069291 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:31.069267 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4476p_046a767f-6fe0-4c57-ba82-6a2ce75f7864/kube-rbac-proxy/0.log" Apr 22 21:41:32.164330 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.164298 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-controller/0.log" Apr 22 21:41:32.179490 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.179464 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/0.log" Apr 22 21:41:32.199262 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.199233 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovn-acl-logging/1.log" Apr 22 21:41:32.218109 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.218084 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/kube-rbac-proxy-node/0.log" Apr 22 21:41:32.236035 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.236011 2546 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 21:41:32.256025 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.255998 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/northd/0.log" Apr 22 21:41:32.274281 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.274251 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/nbdb/0.log" Apr 22 21:41:32.294156 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.294133 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/sbdb/0.log" Apr 22 21:41:32.460787 ip-10-0-138-20 kubenswrapper[2546]: I0422 21:41:32.460710 2546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6flqf_195f14c0-2e98-4e09-8198-0f6696d61fc6/ovnkube-controller/0.log"