Apr 20 20:03:11.479438 ip-10-0-136-158 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 20:03:11.479454 ip-10-0-136-158 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 20:03:11.479465 ip-10-0-136-158 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 20:03:11.479803 ip-10-0-136-158 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 20:03:21.637851 ip-10-0-136-158 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 20:03:21.637873 ip-10-0-136-158 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 196bdd881c0a4d2d8c326d1e82c9d744 --
Apr 20 20:05:56.344553 ip-10-0-136-158 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:05:56.728109 ip-10-0-136-158 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:56.728109 ip-10-0-136-158 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:05:56.728109 ip-10-0-136-158 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:56.728109 ip-10-0-136-158 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:05:56.728109 ip-10-0-136-158 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:56.730496 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.730407    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:05:56.735455 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735435    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:56.735455 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735452    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:56.735455 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735458    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:56.735455 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735462    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735466    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735471    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735474    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735479    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735483    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735487    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735491    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735496    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735500    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735504    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735507    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735517    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735522    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735525    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735529    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735533    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735536    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735540    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735544    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:56.735681 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735547    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735551    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735555    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735559    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735563    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735567    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735572    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735576    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735581    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735585    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735589    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735593    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735598    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735602    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735608    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735613    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735618    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735623    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735627    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:56.736436 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735633    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735639    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735645    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735650    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735654    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735659    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735714    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735756    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735762    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735767    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:56.736943 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.735773    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:56.738029 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.737889    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738032    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738036    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738039    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738042    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738045    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738048    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738052    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738056    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738059    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738061    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738064    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738067    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738069    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738072    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:56.738068 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738075    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738080    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738084    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738088    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738091    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738093    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738096    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738098    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738101    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738106    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738109    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738111    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738118    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738121    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738124    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738127    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738129    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:56.738444 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738132    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738551    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738557    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738560    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738563    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738565    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738568    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738570    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738573    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738576    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738578    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738580    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738583    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738586    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738588    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738591    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738594    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738596    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738598    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738601    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:56.738895 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738603    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738606    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738608    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738611    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738613    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738615    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738618    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738620    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738623    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738625    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738627    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738630    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738633    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738635    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738638    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738641    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738643    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738646    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738649    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:56.739392 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738651    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738653    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738656    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738658    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738661    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738663    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738665    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738668    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738670    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738673    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738675    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738679    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738682    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738685    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738687    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738690    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738692    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738694    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738697    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738699    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:56.739868 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738701    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738704    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738706    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738708    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738711    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738714    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738716    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738719    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738721    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738724    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738728    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738732    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738735    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738738    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738741    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738743    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738746    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738748    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738752    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:56.740375 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738754    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738757    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738760    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738762    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738764    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738767    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738769    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738772    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.738774    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739414    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739422    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739428    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739432    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739436    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739440    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739444    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739449    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739452    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739455    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739458    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739462    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739465    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:05:56.740846 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739468    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739471    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739474    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739477    2573 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739479    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739482    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739488    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739491    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739494    2573 flags.go:64] FLAG: --config-dir=""
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739497    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739500    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739504    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739507    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739510    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739513    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739516    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739519    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739522    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739525    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739528    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739533    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739536    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739539    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739542    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739545    2573 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:05:56.741395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739548    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739552    2573 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739555    2573 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739558    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739561    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739565    2573 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739568    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739571    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739574    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739577    2573 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739580    2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739583    2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739587    2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739590    2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420
20:05:56.739594 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739597 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739599 2573 flags.go:64] FLAG: --feature-gates="" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739603 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739606 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739609 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739612 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739615 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739618 2573 flags.go:64] FLAG: --help="false" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739621 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-136-158.ec2.internal" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739624 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:05:56.742005 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739627 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739630 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739633 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:05:56.742625 
ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739637 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739640 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739642 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739645 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739648 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739651 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739654 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739657 2573 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739660 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739662 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739666 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739668 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739671 2573 flags.go:64] FLAG: --lock-file="" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739674 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739676 2573 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739679 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739685 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739688 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739691 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739695 2573 flags.go:64] FLAG: --logging-format="text" Apr 20 20:05:56.742625 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739698 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739701 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739704 2573 flags.go:64] FLAG: --manifest-url="" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739707 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739711 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739714 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739718 2573 flags.go:64] FLAG: --max-pods="110" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739721 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739724 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:05:56.743211 ip-10-0-136-158 
kubenswrapper[2573]: I0420 20:05:56.739727 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739730 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739735 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739738 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739741 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739749 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739752 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739755 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739758 2573 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739761 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739767 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739770 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739773 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739776 2573 flags.go:64] FLAG: --port="10250" Apr 20 
20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739779 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:05:56.743211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739781 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0460ae2cab79d9555" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739785 2573 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739788 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739790 2573 flags.go:64] FLAG: --register-node="true" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739793 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739796 2573 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739799 2573 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739802 2573 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739805 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739808 2573 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739812 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739815 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739818 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739821 2573 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739824 2573 flags.go:64] FLAG: --runonce="false" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739826 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739829 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739832 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739835 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739838 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739842 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739846 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739849 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739852 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739855 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739857 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:05:56.743825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739860 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739863 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 
20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739866 2573 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739869 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739874 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739877 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739880 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739884 2573 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739887 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739890 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739893 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739896 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739899 2573 flags.go:64] FLAG: --v="2" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739903 2573 flags.go:64] FLAG: --version="false" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739907 2573 flags.go:64] FLAG: --vmodule="" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739912 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.739915 2573 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740012 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740016 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740019 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740023 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740026 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740028 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:56.744479 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740030 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740033 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740035 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740039 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740042 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740044 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 
20:05:56.740047 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740050 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740052 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740055 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740057 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740060 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740063 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740066 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740069 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740072 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740075 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740077 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740080 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740082 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:56.745059 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740084 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740087 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740089 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740092 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740094 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740098 2573 feature_gate.go:328] unrecognized feature 
gate: AWSDedicatedHosts Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740101 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740103 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740105 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740108 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740110 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740113 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740115 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740118 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740120 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740124 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740128 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740131 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:56.745586 ip-10-0-136-158 
kubenswrapper[2573]: W0420 20:05:56.740133 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740136 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:56.745586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740139 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740141 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740144 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740146 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740149 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740151 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740154 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740156 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740158 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740161 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740163 2573 feature_gate.go:328] 
unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740166 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740168 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740171 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740173 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740176 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740178 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740180 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740184 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740187 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:56.746074 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740189 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740191 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740194 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 
20:05:56.740196 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740199 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740201 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740203 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740207 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740211 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740215 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740218 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740221 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740224 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740227 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740230 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740232 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740235 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740238 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740240 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:56.746875 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.740243 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:56.747696 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.740988 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:05:56.748099 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.748076 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 20:05:56.748159 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.748100 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748169 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748178 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748183 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748187 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748191 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748195 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748199 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748203 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748207 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:56.748210 ip-10-0-136-158 
kubenswrapper[2573]: W0420 20:05:56.748211 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:56.748210 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748215 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748219 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748223 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748227 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748231 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748235 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748239 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748243 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748248 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748251 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748255 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748259 2573 feature_gate.go:328] unrecognized feature gate: 
UpgradeStatus Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748263 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748267 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748271 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748275 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748279 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748283 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748287 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748311 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:56.748738 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748315 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748321 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748326 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748330 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748333 2573 feature_gate.go:328] 
unrecognized feature gate: OVNObservability Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748337 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748341 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748345 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748349 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748353 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748360 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748367 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748372 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748376 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748381 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748386 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748390 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748394 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748398 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748403 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:56.749502 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748407 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748411 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748415 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748419 2573 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748423 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748426 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748430 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748434 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748438 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748442 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748446 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748450 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748454 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748459 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748464 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748468 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:56.750066 
ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748472 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748476 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748480 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:56.750066 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748485 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748489 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748493 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748497 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748501 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748505 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748511 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748518 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748523 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748527 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748531 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748536 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748540 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748544 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748548 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748552 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:56.750712 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748556 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.748564 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748726 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748735 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748740 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748744 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748748 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748753 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748756 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748761 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748765 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748770 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748775 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 
20:05:56.748779 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748783 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748787 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:56.751105 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748791 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748795 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748799 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748803 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748809 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748814 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748818 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748823 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748828 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748831 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748836 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748840 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748844 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748848 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748852 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748855 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748860 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748864 2573 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748868 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:56.751646 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748873 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748877 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748880 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748884 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748888 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748892 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748896 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748900 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748905 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748909 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748914 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:56.752103 
ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748918 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748922 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748926 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748930 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748934 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748938 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748942 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748946 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748950 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:05:56.752103 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748954 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748958 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748963 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748966 2573 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerificationPKI Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748971 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748975 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748979 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748982 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748987 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748990 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748994 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.748998 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749002 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749007 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749011 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749015 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 
20:05:56.749019 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749023 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749027 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749031 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:56.752742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749035 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749039 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749043 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749049 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749053 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749057 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749064 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749069 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749074 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749078 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749083 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749087 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:56.749091 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.749099 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:05:56.753560 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.749874 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 20:05:56.753979 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.753964 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 20:05:56.754866 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.754853 2573 server.go:1019] 
"Starting client certificate rotation"
Apr 20 20:05:56.754970 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.754953 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:56.755004 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.754996 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:56.777579 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.777561 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:56.779871 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.779848 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:56.793346 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.793329 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:05:56.803871 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.803853 2573 log.go:25] "Validated CRI v1 image API"
Apr 20 20:05:56.805164 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.805139 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:05:56.808283 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.808265 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:05:56.809602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.809585 2573 fs.go:135] Filesystem UUIDs: map[32c9be79-ce9a-4b18-b17f-88b9537fa4ce:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 de2b5003-eb30-4998-aa7d-e16972d0069f:/dev/nvme0n1p3]
Apr 20 20:05:56.809650 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.809603 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:05:56.815331 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.815201 2573 manager.go:217] Machine: {Timestamp:2026-04-20 20:05:56.813559198 +0000 UTC m=+0.363425802 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098912 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec259146f7024221f7a18dae77f5f47f SystemUUID:ec259146-f702-4221-f7a1-8dae77f5f47f BootID:196bdd88-1c0a-4d2d-8c32-6d1e82c9d744 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:54:77:29:45:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:54:77:29:45:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:39:53:28:00:e7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:05:56.815331 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.815327 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:05:56.815441 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.815399 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:05:56.817684 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.817658 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:05:56.817809 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.817687 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-158.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 20:05:56.817856 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.817818 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 20:05:56.817856 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.817827 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 20:05:56.817856 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.817839 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:05:56.819082 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.819071 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:05:56.819796 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.819786 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:05:56.819903 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.819894 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 20:05:56.821971 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.821961 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 20:05:56.822013 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.821974 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 20:05:56.822013 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.821990 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 20:05:56.822013 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.821998 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 20 20:05:56.822013 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.822006 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 20:05:56.822965 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.822952 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 20:05:56.823012 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.822970 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 20:05:56.825481 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.825462 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 20:05:56.827190 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.827177 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 20:05:56.828380 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828369 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828386 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828392 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828397 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828403 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828408 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828414 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828420 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828426 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 20:05:56.828429 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828433 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 20:05:56.828651 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828441 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 20:05:56.828651 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.828450 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 20:05:56.830034 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.830023 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 20:05:56.830034 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.830034 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 20:05:56.833201 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.833186 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 20:05:56.833269 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.833222 2573 server.go:1295] "Started kubelet"
Apr 20 20:05:56.833342 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.833321 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 20:05:56.833403 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.833345 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 20:05:56.833454 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.833415 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 20:05:56.834077 ip-10-0-136-158 systemd[1]: Started Kubernetes Kubelet.
Apr 20 20:05:56.834689 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.834677 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 20:05:56.834756 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.834697 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 20:05:56.835413 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.835368 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 20:05:56.838142 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.838091 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 20:05:56.838413 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.838390 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-158.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 20:05:56.842349 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.842325 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 20:05:56.842823 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.842787 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 20:05:56.843777 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.843756 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 20:05:56.843937 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.843925 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 20:05:56.844158 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.844138 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 20:05:56.844239 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.844201 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 20:05:56.844239 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.844209 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 20:05:56.844482 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.844465 2573 factory.go:55] Registering systemd factory
Apr 20 20:05:56.844546 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.844491 2573 factory.go:223] Registration of the systemd container factory successfully
Apr 20 20:05:56.844901 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.844885 2573 factory.go:153] Registering CRI-O factory
Apr 20 20:05:56.845022 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.845003 2573 factory.go:223] Registration of the crio container factory successfully
Apr 20 20:05:56.845105 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.844152 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-158.ec2.internal.18a82962aff407dc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-158.ec2.internal,UID:ip-10-0-136-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-158.ec2.internal,},FirstTimestamp:2026-04-20 20:05:56.833200092 +0000 UTC m=+0.383066697,LastTimestamp:2026-04-20 20:05:56.833200092 +0000 UTC m=+0.383066697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-158.ec2.internal,}"
Apr 20 20:05:56.845105 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.845065 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 20:05:56.845105 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.845094 2573 factory.go:103] Registering Raw factory
Apr 20 20:05:56.845236 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.845110 2573 manager.go:1196] Started watching for new ooms in manager
Apr 20 20:05:56.845236 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.845211 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:56.845736 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.845716 2573 manager.go:319] Starting recovery of all containers
Apr 20 20:05:56.846075 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.846054 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 20:05:56.851607 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.851566 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 20:05:56.851683 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.851663 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 20:05:56.855700 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.855556 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kqxcw"
Apr 20 20:05:56.855700 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.855564 2573 manager.go:324] Recovery completed
Apr 20 20:05:56.859907 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.859895 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:56.862213 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.862199 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:56.862284 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.862230 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:56.862284 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.862240 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:56.862759 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.862745 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 20:05:56.862759 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.862758 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 20:05:56.862840 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.862773 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:05:56.863104 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.863091 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kqxcw"
Apr 20 20:05:56.864393 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.864279 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-158.ec2.internal.18a82962b1aeb876 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-158.ec2.internal,UID:ip-10-0-136-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-158.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-158.ec2.internal,},FirstTimestamp:2026-04-20 20:05:56.862212214 +0000 UTC m=+0.412078818,LastTimestamp:2026-04-20 20:05:56.862212214 +0000 UTC m=+0.412078818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-158.ec2.internal,}"
Apr 20 20:05:56.864816 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.864801 2573 policy_none.go:49] "None policy: Start"
Apr 20 20:05:56.864884 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.864822 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 20:05:56.864884 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.864834 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 20:05:56.900951 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.900936 2573 manager.go:341] "Starting Device Plugin manager"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.900978 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.900991 2573 server.go:85] "Starting device plugin registration server"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.901216 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.901229 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.901327 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.901404 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.901413 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.901983 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 20:05:56.912731 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.902021 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:56.971175 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.971147 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 20:05:56.972403 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.972390 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 20:05:56.972463 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.972416 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 20:05:56.972463 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.972432 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 20:05:56.972463 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.972438 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 20:05:56.972606 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:56.972470 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 20:05:56.975606 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:56.975590 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:57.001523 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.001467 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:57.002321 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.002285 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:57.002406 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.002339 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:57.002406 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.002354 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:57.002406 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.002381 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.009459 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.009446 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.009520 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.009465 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-158.ec2.internal\": node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.029024 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.029001 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.073211 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.073188 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"]
Apr 20 20:05:57.073283 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.073243 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:57.075869 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.075845 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:57.075937 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.075878 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:57.075937 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.075887 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:57.077096 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.077085 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:57.077268 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.077256 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.077328 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.077285 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:57.078269 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.078249 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:57.078371 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.078273 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:57.078371 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.078286 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:57.078371 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.078256 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:57.078371 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.078329 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:57.078371 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.078341 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:57.079465 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.079452 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.079512 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.079483 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:57.080369 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.080348 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:57.080369 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.080367 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:57.080465 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.080377 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:57.094807 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.094781 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-158.ec2.internal\" not found" node="ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.098919 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.098904 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-158.ec2.internal\" not found" node="ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.129816 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.129798 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.146469 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.146450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.146546 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.146499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.146599 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.146555 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b4440db6557536c217fdb95da13736d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-158.ec2.internal\" (UID: \"9b4440db6557536c217fdb95da13736d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.230279 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.230247 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.247702 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.247683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.247766 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.247713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.247766 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.247737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b4440db6557536c217fdb95da13736d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-158.ec2.internal\" (UID: \"9b4440db6557536c217fdb95da13736d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.247853 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.247790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9b4440db6557536c217fdb95da13736d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-158.ec2.internal\" (UID: \"9b4440db6557536c217fdb95da13736d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.247853 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.247793 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.247853 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.247798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c2eebab4ba0cac1e68c6bccde729de79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal\" (UID: \"c2eebab4ba0cac1e68c6bccde729de79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.331188 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.331095 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.396651 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.396622 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.401127 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.401113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal"
Apr 20 20:05:57.431913 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.431881 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.532467 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.532429 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.633044 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.632978 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.733547 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.733512 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.754943 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.754924 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 20:05:57.755066 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.755052 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:05:57.833990 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:57.833909 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-158.ec2.internal\" not found"
Apr 20 20:05:57.842470 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.842447 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 20:05:57.847130 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.847036 2573 
reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:57.851463 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.851446 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:05:57.865588 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.865556 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:00:56 +0000 UTC" deadline="2028-01-21 03:07:15.379999618 +0000 UTC" Apr 20 20:05:57.865588 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.865583 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15367h1m17.514419975s" Apr 20 20:05:57.879934 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.879904 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ttxmm" Apr 20 20:05:57.881906 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.881890 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:57.888368 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.888324 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ttxmm" Apr 20 20:05:57.944898 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.944877 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" Apr 20 20:05:57.949077 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:57.949044 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2eebab4ba0cac1e68c6bccde729de79.slice/crio-3476f784c2be9f34c6efced2895b6604b74fa6ec80c78aa9127be39ce946a82a WatchSource:0}: Error finding container 3476f784c2be9f34c6efced2895b6604b74fa6ec80c78aa9127be39ce946a82a: Status 404 returned error can't find the container with id 3476f784c2be9f34c6efced2895b6604b74fa6ec80c78aa9127be39ce946a82a Apr 20 20:05:57.949591 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:57.949571 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4440db6557536c217fdb95da13736d.slice/crio-1f4a3057e1f595a41c17533dc3c732154245ce8acea6fb1ad09e81ad9aa51d0a WatchSource:0}: Error finding container 1f4a3057e1f595a41c17533dc3c732154245ce8acea6fb1ad09e81ad9aa51d0a: Status 404 returned error can't find the container with id 1f4a3057e1f595a41c17533dc3c732154245ce8acea6fb1ad09e81ad9aa51d0a Apr 20 20:05:57.953654 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.953642 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:05:57.957776 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.957754 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:05:57.958624 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.958608 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal" Apr 20 20:05:57.965690 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.965675 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:05:57.975437 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.975382 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal" event={"ID":"9b4440db6557536c217fdb95da13736d","Type":"ContainerStarted","Data":"1f4a3057e1f595a41c17533dc3c732154245ce8acea6fb1ad09e81ad9aa51d0a"} Apr 20 20:05:57.976347 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:57.976327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" event={"ID":"c2eebab4ba0cac1e68c6bccde729de79","Type":"ContainerStarted","Data":"3476f784c2be9f34c6efced2895b6604b74fa6ec80c78aa9127be39ce946a82a"} Apr 20 20:05:58.047847 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.047823 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:58.823223 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.823187 2573 apiserver.go:52] "Watching apiserver" Apr 20 20:05:58.832603 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.832571 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 20:05:58.833614 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.833585 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-t9khm","openshift-network-operator/iptables-alerter-m557q","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z","openshift-cluster-node-tuning-operator/tuned-zr8nd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal","openshift-multus/multus-5r4lb","openshift-ovn-kubernetes/ovnkube-node-664zl","kube-system/konnectivity-agent-9rvtp","kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal","openshift-dns/node-resolver-4qgql","openshift-image-registry/node-ca-dt2xp","openshift-multus/multus-additional-cni-plugins-f9zzt","openshift-multus/network-metrics-daemon-d2g4l"] Apr 20 20:05:58.836525 ip-10-0-136-158 kubenswrapper[2573]: I0420 
20:05:58.836504 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:05:58.836626 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:58.836590 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:05:58.837912 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.837509 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m557q" Apr 20 20:05:58.838588 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.838537 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.840264 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.839696 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.840264 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.839791 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.840719 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.840696 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.841287 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.840872 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.841287 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.840950 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ffztq\"" Apr 20 20:05:58.841287 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.841152 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 20:05:58.841287 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.841159 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 20:05:58.841287 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.841242 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.841564 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.841536 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.841628 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.841608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vxb9h\"" Apr 20 20:05:58.841680 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.841659 2573 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.842210 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.841964 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.842210 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.842018 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.842210 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.842070 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wq42t\"" Apr 20 20:05:58.842210 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.842137 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 20:05:58.842490 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.842472 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.842534 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.842499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bj44p\"" Apr 20 20:05:58.842534 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.842480 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 20:05:58.843185 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.843167 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.844698 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.844231 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:05:58.845188 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.845170 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 20:05:58.845278 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.845207 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.845515 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.845493 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.845689 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.845666 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.847004 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.846455 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 20:05:58.847004 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.846518 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 20:05:58.847004 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.846637 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 20:05:58.847004 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.846699 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 20:05:58.847004 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.846829 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 20:05:58.847004 
ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.846848 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fqqdn\"" Apr 20 20:05:58.847004 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.846974 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9vxnk\"" Apr 20 20:05:58.847753 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.847730 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8zqck\"" Apr 20 20:05:58.847839 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.847783 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.847839 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.847792 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.848134 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.848114 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dt2xp" Apr 20 20:05:58.848220 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.848206 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:58.849605 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.849586 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:05:58.849694 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:58.849656 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:05:58.850000 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.849976 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 20:05:58.851270 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.850658 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 20:05:58.853257 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.853240 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.853474 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.853461 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gbmxj\"" Apr 20 20:05:58.853669 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.853654 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.853733 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.853702 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tpnm4\"" Apr 20 20:05:58.853825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.853807 2573 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 20:05:58.857720 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.857690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-cni-multus\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.857822 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.857735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-tmp-dir\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.857885 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.857844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysconfig\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.857937 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.857881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-tmp\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.857988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.857967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-hostroot\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.858043 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.857994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-lib-modules\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.858043 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-host\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.858148 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.858148 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-run\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.858148 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858100 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-device-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.858148 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-multus-certs\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.858360 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-kubernetes\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.858360 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-sys\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.858360 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-var-lib-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" 
Apr 20 20:05:58.858360 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-etc-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.858360 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-etc-kubernetes\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.858596 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r886z\" (UniqueName: \"kubernetes.io/projected/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-kube-api-access-r886z\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.858596 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-env-overrides\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.858596 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858441 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-registration-dir\") pod 
\"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.858596 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858490 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7g2g\" (UniqueName: \"kubernetes.io/projected/cbd70467-a515-4215-9dc3-01d2315f4601-kube-api-access-x7g2g\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.858596 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-cni-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.858596 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-cni-bin\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.858860 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-host-slash\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q" Apr 20 20:05:58.858980 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.858932 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-tuned\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.859078 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.859033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-slash\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.859138 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.859075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovnkube-config\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.859138 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.859107 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-systemd\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.859240 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.859137 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-ovn\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.860680 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-serviceca\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp"
Apr 20 20:05:58.860680 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860653 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysctl-conf\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.860847 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-cni-bin\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.860847 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-system-cni-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.860847 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-var-lib-kubelet\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.860847 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxzb\" (UniqueName: \"kubernetes.io/projected/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-kube-api-access-rdxzb\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.861040 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-kubelet\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.861040 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-systemd-units\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.861040 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.861040 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.860993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-kubelet\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.861234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zz2p\" (UniqueName: \"kubernetes.io/projected/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-kube-api-access-9zz2p\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.861234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-os-release\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.861234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-netns\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.861234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-modprobe-d\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.861234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-node-log\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.861234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-cni-netd\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhvtn\" (UniqueName: \"kubernetes.io/projected/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-kube-api-access-dhvtn\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861282 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-etc-selinux\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-cnibin\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861357 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/309654ff-b783-421b-91bd-5ae144783aa3-konnectivity-ca\") pod \"konnectivity-agent-9rvtp\" (UID: \"309654ff-b783-421b-91bd-5ae144783aa3\") " pod="kube-system/konnectivity-agent-9rvtp"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861418 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysctl-d\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-systemd\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-host\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-k8s-cni-cncf-io\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-daemon-config\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.861612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-iptables-alerter-script\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmf4v\" (UniqueName: \"kubernetes.io/projected/c5c20964-6b44-4902-91b4-e2f99aceca2f-kube-api-access-wmf4v\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-socket-dir-parent\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-conf-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-hosts-file\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-socket-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqv46\" (UniqueName: \"kubernetes.io/projected/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-kube-api-access-tqv46\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-run-netns\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861833 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-log-socket\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861935 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.861967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovnkube-script-lib\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.862006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/309654ff-b783-421b-91bd-5ae144783aa3-agent-certs\") pod \"konnectivity-agent-9rvtp\" (UID: \"309654ff-b783-421b-91bd-5ae144783aa3\") " pod="kube-system/konnectivity-agent-9rvtp"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.862034 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-sys-fs\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z"
Apr 20 20:05:58.862069 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.862057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-cni-binary-copy\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.862739 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.862139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovn-node-metrics-cert\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.862739 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.862176 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:05:58.889094 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.889064 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:57 +0000 UTC" deadline="2027-11-04 18:45:17.375566282 +0000 UTC"
Apr 20 20:05:58.889094 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.889093 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13510h39m18.486475534s"
Apr 20 20:05:58.944890 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.944865 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 20:05:58.962575 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-env-overrides\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-registration-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7g2g\" (UniqueName: \"kubernetes.io/projected/cbd70467-a515-4215-9dc3-01d2315f4601-kube-api-access-x7g2g\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-cni-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962654 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-cni-bin\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-host-slash\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-tuned\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-slash\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.962772 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962744 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovnkube-config\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-cnibin\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-systemd\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-ovn\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-serviceca\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962897 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysctl-conf\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-cni-bin\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.962954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-system-cni-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9z6x\" (UniqueName: \"kubernetes.io/projected/9f213e16-074a-493b-b57c-f84483b57308-kube-api-access-b9z6x\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-var-lib-kubelet\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963082 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-env-overrides\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxzb\" (UniqueName: \"kubernetes.io/projected/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-kube-api-access-rdxzb\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.963145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-kubelet\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963158 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-registration-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-systemd-units\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-kubelet\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-system-cni-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-var-lib-kubelet\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963528 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-cni-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysctl-conf\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-kubelet\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zz2p\" (UniqueName: \"kubernetes.io/projected/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-kube-api-access-9zz2p\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-cni-bin\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963655 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-kubelet\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-cni-bin\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovnkube-config\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-serviceca\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963690 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.963723 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-systemd-units\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963721 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-slash\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-systemd\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-host-slash\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-os-release\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963789 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-netns\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-ovn\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-system-cni-dir\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-os-release\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-netns\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-modprobe-d\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963970 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-node-log\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.963989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-cni-netd\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-cni-netd\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964038 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-node-log\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl"
Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName:
\"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-modprobe-d\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhvtn\" (UniqueName: \"kubernetes.io/projected/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-kube-api-access-dhvtn\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp" Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964014 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:05:58.964433 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964084 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-etc-selinux\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-cnibin\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964164 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/309654ff-b783-421b-91bd-5ae144783aa3-konnectivity-ca\") pod 
\"konnectivity-agent-9rvtp\" (UID: \"309654ff-b783-421b-91bd-5ae144783aa3\") " pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964191 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysctl-d\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-cnibin\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-systemd\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-host\") pod \"node-ca-dt2xp\" (UID: 
\"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-etc-selinux\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-k8s-cni-cncf-io\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-host\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-daemon-config\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-k8s-cni-cncf-io\") pod \"multus-5r4lb\" (UID: 
\"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysctl-d\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964502 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-iptables-alerter-script\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-run-systemd\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmf4v\" (UniqueName: \"kubernetes.io/projected/c5c20964-6b44-4902-91b4-e2f99aceca2f-kube-api-access-wmf4v\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-socket-dir-parent\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.965234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-conf-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-hosts-file\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964871 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-socket-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964897 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqv46\" (UniqueName: \"kubernetes.io/projected/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-kube-api-access-tqv46\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-hosts-file\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-run-netns\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-conf-dir\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.964999 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-daemon-config\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-iptables-alerter-script\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-run-netns\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-log-socket\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 
20:05:58.965134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-socket-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-log-socket\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966062 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovnkube-script-lib\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/309654ff-b783-421b-91bd-5ae144783aa3-agent-certs\") pod \"konnectivity-agent-9rvtp\" (UID: \"309654ff-b783-421b-91bd-5ae144783aa3\") " pod="kube-system/konnectivity-agent-9rvtp" Apr 
20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-sys-fs\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-cni-binary-copy\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovn-node-metrics-cert\") pod \"ovnkube-node-664zl\" (UID: 
\"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/309654ff-b783-421b-91bd-5ae144783aa3-konnectivity-ca\") pod \"konnectivity-agent-9rvtp\" (UID: \"309654ff-b783-421b-91bd-5ae144783aa3\") " pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965458 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-cni-multus\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-multus-socket-dir-parent\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965509 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-tmp-dir\") pod 
\"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-var-lib-cni-multus\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-sys-fs\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-os-release\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965875 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-tmp-dir\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysconfig\") 
pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.966955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.965957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-tmp\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-sysconfig\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-hostroot\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966059 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pt8k\" (UniqueName: 
\"kubernetes.io/projected/99945c0f-09c6-48fb-84b1-4299b5936bd6-kube-api-access-5pt8k\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-cni-binary-copy\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-hostroot\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-lib-modules\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-host\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-host\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-lib-modules\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966238 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-run\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-device-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-run\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-multus-certs\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cbd70467-a515-4215-9dc3-01d2315f4601-device-dir\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.967813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-kubernetes\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-sys\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-host-run-multus-certs\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-var-lib-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-kubernetes\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966475 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-sys\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-etc-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-var-lib-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-etc-kubernetes\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r886z\" (UniqueName: \"kubernetes.io/projected/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-kube-api-access-r886z\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5c20964-6b44-4902-91b4-e2f99aceca2f-etc-openvswitch\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-etc-kubernetes\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.966902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovnkube-script-lib\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.968498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-etc-tuned\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.968668 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.968672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-tmp\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.969100 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.968739 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/309654ff-b783-421b-91bd-5ae144783aa3-agent-certs\") pod \"konnectivity-agent-9rvtp\" (UID: \"309654ff-b783-421b-91bd-5ae144783aa3\") " pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:05:58.969100 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.968931 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/c5c20964-6b44-4902-91b4-e2f99aceca2f-ovn-node-metrics-cert\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.974995 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.973532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmf4v\" (UniqueName: \"kubernetes.io/projected/c5c20964-6b44-4902-91b4-e2f99aceca2f-kube-api-access-wmf4v\") pod \"ovnkube-node-664zl\" (UID: \"c5c20964-6b44-4902-91b4-e2f99aceca2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:58.974995 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.973541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhvtn\" (UniqueName: \"kubernetes.io/projected/aa0188c0-4215-47d0-a910-5a4c74cbc7cc-kube-api-access-dhvtn\") pod \"node-ca-dt2xp\" (UID: \"aa0188c0-4215-47d0-a910-5a4c74cbc7cc\") " pod="openshift-image-registry/node-ca-dt2xp" Apr 20 20:05:58.975187 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.975088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zz2p\" (UniqueName: \"kubernetes.io/projected/f3332e18-c5c0-47a4-a5ed-4b719b4bc831-kube-api-access-9zz2p\") pod \"multus-5r4lb\" (UID: \"f3332e18-c5c0-47a4-a5ed-4b719b4bc831\") " pod="openshift-multus/multus-5r4lb" Apr 20 20:05:58.976077 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.975722 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7g2g\" (UniqueName: \"kubernetes.io/projected/cbd70467-a515-4215-9dc3-01d2315f4601-kube-api-access-x7g2g\") pod \"aws-ebs-csi-driver-node-wj97z\" (UID: \"cbd70467-a515-4215-9dc3-01d2315f4601\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:58.976182 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.976129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rdxzb\" (UniqueName: \"kubernetes.io/projected/7f785e5a-c4aa-40db-a581-8d086c1bf8cb-kube-api-access-rdxzb\") pod \"tuned-zr8nd\" (UID: \"7f785e5a-c4aa-40db-a581-8d086c1bf8cb\") " pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:58.976792 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.976764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r886z\" (UniqueName: \"kubernetes.io/projected/6f6d52d5-73cf-459b-a235-e5cfe1d91c81-kube-api-access-r886z\") pod \"node-resolver-4qgql\" (UID: \"6f6d52d5-73cf-459b-a235-e5cfe1d91c81\") " pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:58.977710 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:58.977425 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:58.977710 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:58.977697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqv46\" (UniqueName: \"kubernetes.io/projected/01f7cf40-a02f-4e4d-846f-75cdf011fbb1-kube-api-access-tqv46\") pod \"iptables-alerter-m557q\" (UID: \"01f7cf40-a02f-4e4d-846f-75cdf011fbb1\") " pod="openshift-network-operator/iptables-alerter-m557q" Apr 20 20:05:58.977857 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:58.977716 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:58.977857 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:58.977734 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wrt8k for pod openshift-network-diagnostics/network-check-target-t9khm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 
20:05:58.977857 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:58.977814 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k podName:5409aec2-613d-49b4-aad6-5dda25f70168 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.477780211 +0000 UTC m=+3.027646826 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wrt8k" (UniqueName: "kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k") pod "network-check-target-t9khm" (UID: "5409aec2-613d-49b4-aad6-5dda25f70168") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:59.056710 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.056669 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:59.067050 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067050 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-system-cni-dir\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067256 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067075 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:05:59.067256 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067256 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067256 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-os-release\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067256 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067204 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " 
pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067256 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pt8k\" (UniqueName: \"kubernetes.io/projected/99945c0f-09c6-48fb-84b1-4299b5936bd6-kube-api-access-5pt8k\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-cnibin\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9z6x\" (UniqueName: \"kubernetes.io/projected/9f213e16-074a-493b-b57c-f84483b57308-kube-api-access-b9z6x\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:05:59.067602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-system-cni-dir\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-os-release\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99945c0f-09c6-48fb-84b1-4299b5936bd6-cnibin\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067877 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.067674 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:59.067877 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067877 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.067877 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.067748 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.567732143 +0000 UTC m=+3.117598736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:59.067877 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.067856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/99945c0f-09c6-48fb-84b1-4299b5936bd6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.077361 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.077251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9z6x\" (UniqueName: \"kubernetes.io/projected/9f213e16-074a-493b-b57c-f84483b57308-kube-api-access-b9z6x\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:05:59.077806 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.077788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pt8k\" (UniqueName: 
\"kubernetes.io/projected/99945c0f-09c6-48fb-84b1-4299b5936bd6-kube-api-access-5pt8k\") pod \"multus-additional-cni-plugins-f9zzt\" (UID: \"99945c0f-09c6-48fb-84b1-4299b5936bd6\") " pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.159912 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.159885 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m557q" Apr 20 20:05:59.167738 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.167710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" Apr 20 20:05:59.178435 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.178405 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" Apr 20 20:05:59.184143 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.184122 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5r4lb" Apr 20 20:05:59.191867 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.191847 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:05:59.199592 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.199571 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:05:59.207118 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.207099 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4qgql" Apr 20 20:05:59.214631 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.214612 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dt2xp" Apr 20 20:05:59.220147 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.220130 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" Apr 20 20:05:59.566892 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.566867 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod309654ff_b783_421b_91bd_5ae144783aa3.slice/crio-eca7c0c009a6079cc9187c69e67938b0d8fa5f85b91e7f78f4f3c92be5a76a7a WatchSource:0}: Error finding container eca7c0c009a6079cc9187c69e67938b0d8fa5f85b91e7f78f4f3c92be5a76a7a: Status 404 returned error can't find the container with id eca7c0c009a6079cc9187c69e67938b0d8fa5f85b91e7f78f4f3c92be5a76a7a Apr 20 20:05:59.568222 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.568128 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3332e18_c5c0_47a4_a5ed_4b719b4bc831.slice/crio-3b8823bbb9760ffbcff5227daf8df4fb75de73ac8955f6985b9636795748dce8 WatchSource:0}: Error finding container 3b8823bbb9760ffbcff5227daf8df4fb75de73ac8955f6985b9636795748dce8: Status 404 returned error can't find the container with id 3b8823bbb9760ffbcff5227daf8df4fb75de73ac8955f6985b9636795748dce8 Apr 20 20:05:59.570233 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.570106 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f7cf40_a02f_4e4d_846f_75cdf011fbb1.slice/crio-8a847f61340102cd559aeb0c5b06a59e304d5d2a97bcbe0093a57154f286fef1 WatchSource:0}: Error finding container 8a847f61340102cd559aeb0c5b06a59e304d5d2a97bcbe0093a57154f286fef1: Status 404 returned error can't find the container with id 8a847f61340102cd559aeb0c5b06a59e304d5d2a97bcbe0093a57154f286fef1 Apr 20 20:05:59.570835 ip-10-0-136-158 
kubenswrapper[2573]: I0420 20:05:59.570804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:05:59.570899 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.570856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:05:59.571011 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.570995 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:59.571068 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.571046 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:00.571030973 +0000 UTC m=+4.120897566 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:59.571068 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.570996 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:59.571068 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.571067 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:59.571215 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.571080 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wrt8k for pod openshift-network-diagnostics/network-check-target-t9khm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:59.571215 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.571113 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k podName:5409aec2-613d-49b4-aad6-5dda25f70168 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:00.571103858 +0000 UTC m=+4.120970466 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wrt8k" (UniqueName: "kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k") pod "network-check-target-t9khm" (UID: "5409aec2-613d-49b4-aad6-5dda25f70168") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:59.575861 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.575605 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f785e5a_c4aa_40db_a581_8d086c1bf8cb.slice/crio-ef5741e545d56eae53807ce75987622e4194666958816b9fea7c86f891a7986e WatchSource:0}: Error finding container ef5741e545d56eae53807ce75987622e4194666958816b9fea7c86f891a7986e: Status 404 returned error can't find the container with id ef5741e545d56eae53807ce75987622e4194666958816b9fea7c86f891a7986e Apr 20 20:05:59.576406 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.576385 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c20964_6b44_4902_91b4_e2f99aceca2f.slice/crio-3bd15ca7ffba10d9a9ab2566fadb7ba2cbf9fe242184293e2a7dded94ee1854a WatchSource:0}: Error finding container 3bd15ca7ffba10d9a9ab2566fadb7ba2cbf9fe242184293e2a7dded94ee1854a: Status 404 returned error can't find the container with id 3bd15ca7ffba10d9a9ab2566fadb7ba2cbf9fe242184293e2a7dded94ee1854a Apr 20 20:05:59.577146 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.577126 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbd70467_a515_4215_9dc3_01d2315f4601.slice/crio-446fc05c4722480a0ba2a0dfe5314e3543cd9f76841f1316f5bb8f745a71b3db WatchSource:0}: Error finding container 446fc05c4722480a0ba2a0dfe5314e3543cd9f76841f1316f5bb8f745a71b3db: Status 404 returned error can't find the 
container with id 446fc05c4722480a0ba2a0dfe5314e3543cd9f76841f1316f5bb8f745a71b3db
Apr 20 20:05:59.579005 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.578868 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0188c0_4215_47d0_a910_5a4c74cbc7cc.slice/crio-04a5cb3d58ed97441840db11693673c66e35b887a3536f56d8ec0aa47605963e WatchSource:0}: Error finding container 04a5cb3d58ed97441840db11693673c66e35b887a3536f56d8ec0aa47605963e: Status 404 returned error can't find the container with id 04a5cb3d58ed97441840db11693673c66e35b887a3536f56d8ec0aa47605963e
Apr 20 20:05:59.579737 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:05:59.579714 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99945c0f_09c6_48fb_84b1_4299b5936bd6.slice/crio-958d84a2966fec67953fa9742ec1b5c9dbe0b53d286cc15d79c4f343b0567498 WatchSource:0}: Error finding container 958d84a2966fec67953fa9742ec1b5c9dbe0b53d286cc15d79c4f343b0567498: Status 404 returned error can't find the container with id 958d84a2966fec67953fa9742ec1b5c9dbe0b53d286cc15d79c4f343b0567498
Apr 20 20:05:59.741246 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.741071 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wvq2j"]
Apr 20 20:05:59.742722 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.742700 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.742837 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.742779 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747"
Apr 20 20:05:59.772038 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.772010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dd310fd8-ff36-47d6-9dbf-d8d029c30747-kubelet-config\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.772038 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.772043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.772221 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.772069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dd310fd8-ff36-47d6-9dbf-d8d029c30747-dbus\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.873581 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.873475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dd310fd8-ff36-47d6-9dbf-d8d029c30747-kubelet-config\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.873581 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.873508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.873581 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.873536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dd310fd8-ff36-47d6-9dbf-d8d029c30747-dbus\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.874112 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.873599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dd310fd8-ff36-47d6-9dbf-d8d029c30747-kubelet-config\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.874112 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.873644 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:59.874112 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.873699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dd310fd8-ff36-47d6-9dbf-d8d029c30747-dbus\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:05:59.874112 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:05:59.873706 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret podName:dd310fd8-ff36-47d6-9dbf-d8d029c30747 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:00.373689281 +0000 UTC m=+3.923555873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret") pod "global-pull-secret-syncer-wvq2j" (UID: "dd310fd8-ff36-47d6-9dbf-d8d029c30747") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:05:59.889790 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.889749 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:57 +0000 UTC" deadline="2027-12-07 06:12:53.966808959 +0000 UTC"
Apr 20 20:05:59.889790 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.889785 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14290h6m54.077027385s"
Apr 20 20:05:59.985483 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.985441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal" event={"ID":"9b4440db6557536c217fdb95da13736d","Type":"ContainerStarted","Data":"272920c2e8345e3440e1c83baddcb3c429c41c133d6918f92b74053e2f7f1dd4"}
Apr 20 20:05:59.989736 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.989685 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4qgql" event={"ID":"6f6d52d5-73cf-459b-a235-e5cfe1d91c81","Type":"ContainerStarted","Data":"121b8c710135fa1fff8411503c71e7facafa840cc4916a76a7afd1ef05c54b4b"}
Apr 20 20:05:59.990973 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.990938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerStarted","Data":"958d84a2966fec67953fa9742ec1b5c9dbe0b53d286cc15d79c4f343b0567498"}
Apr 20 20:05:59.992016 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.991990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dt2xp" event={"ID":"aa0188c0-4215-47d0-a910-5a4c74cbc7cc","Type":"ContainerStarted","Data":"04a5cb3d58ed97441840db11693673c66e35b887a3536f56d8ec0aa47605963e"}
Apr 20 20:05:59.993255 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.993219 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"3bd15ca7ffba10d9a9ab2566fadb7ba2cbf9fe242184293e2a7dded94ee1854a"}
Apr 20 20:05:59.994142 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.994115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m557q" event={"ID":"01f7cf40-a02f-4e4d-846f-75cdf011fbb1","Type":"ContainerStarted","Data":"8a847f61340102cd559aeb0c5b06a59e304d5d2a97bcbe0093a57154f286fef1"}
Apr 20 20:05:59.995161 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.995136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5r4lb" event={"ID":"f3332e18-c5c0-47a4-a5ed-4b719b4bc831","Type":"ContainerStarted","Data":"3b8823bbb9760ffbcff5227daf8df4fb75de73ac8955f6985b9636795748dce8"}
Apr 20 20:05:59.996474 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.996449 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" event={"ID":"cbd70467-a515-4215-9dc3-01d2315f4601","Type":"ContainerStarted","Data":"446fc05c4722480a0ba2a0dfe5314e3543cd9f76841f1316f5bb8f745a71b3db"}
Apr 20 20:05:59.997628 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.997607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" event={"ID":"7f785e5a-c4aa-40db-a581-8d086c1bf8cb","Type":"ContainerStarted","Data":"ef5741e545d56eae53807ce75987622e4194666958816b9fea7c86f891a7986e"}
Apr 20 20:05:59.998719 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:05:59.998699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9rvtp" event={"ID":"309654ff-b783-421b-91bd-5ae144783aa3","Type":"ContainerStarted","Data":"eca7c0c009a6079cc9187c69e67938b0d8fa5f85b91e7f78f4f3c92be5a76a7a"}
Apr 20 20:06:00.378910 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:00.378860 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:00.379100 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.379083 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:00.379171 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.379151 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret podName:dd310fd8-ff36-47d6-9dbf-d8d029c30747 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:01.379131413 +0000 UTC m=+4.928998009 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret") pod "global-pull-secret-syncer-wvq2j" (UID: "dd310fd8-ff36-47d6-9dbf-d8d029c30747") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:00.589710 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:00.589482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:00.589710 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:00.589550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:00.589710 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.589688 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:00.589710 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.589708 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:00.590012 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.589720 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wrt8k for pod openshift-network-diagnostics/network-check-target-t9khm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:00.590012 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.589782 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k podName:5409aec2-613d-49b4-aad6-5dda25f70168 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:02.589763534 +0000 UTC m=+6.139630131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrt8k" (UniqueName: "kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k") pod "network-check-target-t9khm" (UID: "5409aec2-613d-49b4-aad6-5dda25f70168") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:00.590584 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.590202 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:00.590584 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.590266 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:02.59025026 +0000 UTC m=+6.140116855 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:00.973785 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:00.973181 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:00.973785 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.973346 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308"
Apr 20 20:06:00.974536 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:00.974388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:00.974536 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:00.974485 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168"
Apr 20 20:06:01.005790 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:01.005470 2573 generic.go:358] "Generic (PLEG): container finished" podID="c2eebab4ba0cac1e68c6bccde729de79" containerID="1ee33a2115d2ceb24cff95a77063abe8c2e1d5876ede1c58e9ec901cbb6a8b2b" exitCode=0
Apr 20 20:06:01.005790 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:01.005580 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" event={"ID":"c2eebab4ba0cac1e68c6bccde729de79","Type":"ContainerDied","Data":"1ee33a2115d2ceb24cff95a77063abe8c2e1d5876ede1c58e9ec901cbb6a8b2b"}
Apr 20 20:06:01.020117 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:01.020060 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-158.ec2.internal" podStartSLOduration=4.02004159 podStartE2EDuration="4.02004159s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:06:00.003605467 +0000 UTC m=+3.553472081" watchObservedRunningTime="2026-04-20 20:06:01.02004159 +0000 UTC m=+4.569908205"
Apr 20 20:06:01.396996 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:01.396957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:01.397169 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:01.397115 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:01.397222 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:01.397203 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret podName:dd310fd8-ff36-47d6-9dbf-d8d029c30747 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:03.397161581 +0000 UTC m=+6.947028177 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret") pod "global-pull-secret-syncer-wvq2j" (UID: "dd310fd8-ff36-47d6-9dbf-d8d029c30747") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:01.973330 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:01.972903 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:01.973330 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:01.973043 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747"
Apr 20 20:06:02.028778 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:02.028003 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" event={"ID":"c2eebab4ba0cac1e68c6bccde729de79","Type":"ContainerStarted","Data":"ce3bd60f0c4614790b7e769793887b5d4762f47e277cb41010e2ca64790a048c"}
Apr 20 20:06:02.043858 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:02.043655 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-158.ec2.internal" podStartSLOduration=5.043632509 podStartE2EDuration="5.043632509s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:06:02.042660544 +0000 UTC m=+5.592527158" watchObservedRunningTime="2026-04-20 20:06:02.043632509 +0000 UTC m=+5.593499124"
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:02.610783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:02.610847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.611003 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.611021 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.611035 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wrt8k for pod openshift-network-diagnostics/network-check-target-t9khm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.611092 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k podName:5409aec2-613d-49b4-aad6-5dda25f70168 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:06.611074041 +0000 UTC m=+10.160940647 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrt8k" (UniqueName: "kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k") pod "network-check-target-t9khm" (UID: "5409aec2-613d-49b4-aad6-5dda25f70168") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.611499 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:02.611587 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.611551 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:06.611536013 +0000 UTC m=+10.161402608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:02.974017 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:02.973426 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:02.974017 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.973572 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308"
Apr 20 20:06:02.974349 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:02.974319 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:02.974459 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:02.974426 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168"
Apr 20 20:06:03.417375 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:03.416861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:03.417375 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:03.417016 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:03.417375 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:03.417082 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret podName:dd310fd8-ff36-47d6-9dbf-d8d029c30747 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:07.417063793 +0000 UTC m=+10.966930386 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret") pod "global-pull-secret-syncer-wvq2j" (UID: "dd310fd8-ff36-47d6-9dbf-d8d029c30747") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:03.973271 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:03.973233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:03.973459 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:03.973400 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747"
Apr 20 20:06:04.972634 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:04.972606 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:04.973081 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:04.972615 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:04.973081 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:04.972754 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168"
Apr 20 20:06:04.973308 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:04.973268 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308"
Apr 20 20:06:05.972988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:05.972948 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:05.973399 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:05.973072 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747"
Apr 20 20:06:06.647565 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:06.647518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:06.647756 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:06.647636 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:06.647756 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.647688 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:06.647756 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.647711 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:06.647756 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.647723 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:06.647756 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.647725 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wrt8k for pod openshift-network-diagnostics/network-check-target-t9khm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:06.647969 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.647775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:14.647762303 +0000 UTC m=+18.197628896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:06.647969 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.647790 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k podName:5409aec2-613d-49b4-aad6-5dda25f70168 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:14.647783153 +0000 UTC m=+18.197649746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrt8k" (UniqueName: "kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k") pod "network-check-target-t9khm" (UID: "5409aec2-613d-49b4-aad6-5dda25f70168") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:06.974225 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:06.974153 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:06.974629 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.974272 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168"
Apr 20 20:06:06.974693 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:06.974679 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:06.974763 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:06.974749 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308"
Apr 20 20:06:07.454246 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:07.454210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:07.454449 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:07.454349 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:07.454449 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:07.454420 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret podName:dd310fd8-ff36-47d6-9dbf-d8d029c30747 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:15.454397517 +0000 UTC m=+19.004264132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret") pod "global-pull-secret-syncer-wvq2j" (UID: "dd310fd8-ff36-47d6-9dbf-d8d029c30747") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:07.973594 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:07.973558 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:07.973763 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:07.973669 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747"
Apr 20 20:06:08.973559 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:08.973519 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:08.973991 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:08.973526 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:08.973991 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:08.973637 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168"
Apr 20 20:06:08.973991 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:08.973722 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308"
Apr 20 20:06:09.972699 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:09.972656 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:09.972890 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:09.972794 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747"
Apr 20 20:06:10.973319 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:10.973263 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:10.973831 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:10.973263 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:10.973831 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:10.973418 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308"
Apr 20 20:06:10.973831 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:10.973486 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:11.973032 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:11.973003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:11.973196 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:11.973116 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:12.973520 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:12.973481 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:12.973972 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:12.973623 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:12.973972 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:12.973667 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:12.973972 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:12.973771 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:13.973510 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:13.973475 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:13.973665 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:13.973605 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:14.708824 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:14.708778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:14.708991 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:14.708850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:14.708991 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.708902 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:14.708991 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.708976 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:30.7089538 +0000 UTC m=+34.258820399 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:14.709178 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.708997 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:14.709178 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.709012 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:14.709178 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.709024 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wrt8k for pod openshift-network-diagnostics/network-check-target-t9khm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:14.709178 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.709066 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k podName:5409aec2-613d-49b4-aad6-5dda25f70168 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:30.709056264 +0000 UTC m=+34.258922856 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wrt8k" (UniqueName: "kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k") pod "network-check-target-t9khm" (UID: "5409aec2-613d-49b4-aad6-5dda25f70168") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:14.973703 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:14.973616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:14.973703 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:14.973642 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:14.974187 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.973761 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:14.974187 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:14.973887 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:15.516827 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:15.516786 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:15.516997 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:15.516919 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:06:15.516997 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:15.516981 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret podName:dd310fd8-ff36-47d6-9dbf-d8d029c30747 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:31.51696822 +0000 UTC m=+35.066834811 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret") pod "global-pull-secret-syncer-wvq2j" (UID: "dd310fd8-ff36-47d6-9dbf-d8d029c30747") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:06:15.972964 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:15.972930 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:15.973122 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:15.973042 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:16.974207 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:16.973788 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:16.974207 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:16.973906 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:16.974207 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:16.973960 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:16.974207 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:16.974058 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:17.973108 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:17.972824 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:17.973216 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:17.973184 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:18.056865 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.056832 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:06:18.057640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.057185 2573 generic.go:358] "Generic (PLEG): container finished" podID="c5c20964-6b44-4902-91b4-e2f99aceca2f" containerID="796899dc9ca08c809f51703836cddc2fb76b77ce60ffee84be7ac8cadfb58d7b" exitCode=1 Apr 20 20:06:18.057640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.057259 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"f0759845ca00f2bf031605eea331d3b7b29f0c61216f97b1bf3b6c94316bbd90"} Apr 20 20:06:18.057640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.057285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"e03394d246ccb2924d78cb3ca1ecee3b195e9433152551ff45ee2a1acb6cc475"} Apr 20 
20:06:18.057640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.057310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"74f0d4c8062679aed5ba1ca356138bb01d1d9430a6129d0cbc754759b0fcdb4f"} Apr 20 20:06:18.057640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.057320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"524beceeafa4f33e275160bb381bff240f2b52e5a193fdb49c4757fa98cd71c9"} Apr 20 20:06:18.057640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.057334 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerDied","Data":"796899dc9ca08c809f51703836cddc2fb76b77ce60ffee84be7ac8cadfb58d7b"} Apr 20 20:06:18.057640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.057347 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"0537921b8f407f453fe294ada3da927f81ced11bbdd2a56b14783b5914a15373"} Apr 20 20:06:18.058880 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.058846 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5r4lb" event={"ID":"f3332e18-c5c0-47a4-a5ed-4b719b4bc831","Type":"ContainerStarted","Data":"e799e14b1d22706ef4e9584d99095a63434de93eab43acf0365b6d7d797902b0"} Apr 20 20:06:18.060357 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.060331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" 
event={"ID":"cbd70467-a515-4215-9dc3-01d2315f4601","Type":"ContainerStarted","Data":"f4f6c17fa9c2c4b2b92c103fbb10acde2062991ef08c1d9c044ee24b8a02cf9b"} Apr 20 20:06:18.061683 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.061654 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" event={"ID":"7f785e5a-c4aa-40db-a581-8d086c1bf8cb","Type":"ContainerStarted","Data":"c046441439abcfddb6bdf3928efc2fd4d170ffe171118a37b45c9949163ca1f8"} Apr 20 20:06:18.063218 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.063193 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9rvtp" event={"ID":"309654ff-b783-421b-91bd-5ae144783aa3","Type":"ContainerStarted","Data":"bf4d1b8030694f15a17562dd5f490f7e8d081aa9b9c756285f237c7d5bbd112c"} Apr 20 20:06:18.064744 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.064720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4qgql" event={"ID":"6f6d52d5-73cf-459b-a235-e5cfe1d91c81","Type":"ContainerStarted","Data":"a54d44ee4cea94fb1c39230c72d84c874b09564a3b6f1089583c0760baa7535d"} Apr 20 20:06:18.069651 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.069626 2573 generic.go:358] "Generic (PLEG): container finished" podID="99945c0f-09c6-48fb-84b1-4299b5936bd6" containerID="a9c160dadd3693fb9ecdd3aab33d4e1f64a7c06f7eea4f5af58058e28bd01699" exitCode=0 Apr 20 20:06:18.069739 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.069687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerDied","Data":"a9c160dadd3693fb9ecdd3aab33d4e1f64a7c06f7eea4f5af58058e28bd01699"} Apr 20 20:06:18.071653 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.071232 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dt2xp" 
event={"ID":"aa0188c0-4215-47d0-a910-5a4c74cbc7cc","Type":"ContainerStarted","Data":"7d955e4164b2bb937fdd9388ffa502774074d092225b2cbd404f9004153721f3"} Apr 20 20:06:18.072600 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.072555 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5r4lb" podStartSLOduration=3.689584241 podStartE2EDuration="21.072540892s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.570837054 +0000 UTC m=+3.120703662" lastFinishedPulling="2026-04-20 20:06:16.953793721 +0000 UTC m=+20.503660313" observedRunningTime="2026-04-20 20:06:18.071778132 +0000 UTC m=+21.621644747" watchObservedRunningTime="2026-04-20 20:06:18.072540892 +0000 UTC m=+21.622407509" Apr 20 20:06:18.099481 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.099441 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zr8nd" podStartSLOduration=3.74199442 podStartE2EDuration="21.099431716s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.577325696 +0000 UTC m=+3.127192289" lastFinishedPulling="2026-04-20 20:06:16.934762978 +0000 UTC m=+20.484629585" observedRunningTime="2026-04-20 20:06:18.084192444 +0000 UTC m=+21.634059057" watchObservedRunningTime="2026-04-20 20:06:18.099431716 +0000 UTC m=+21.649298336" Apr 20 20:06:18.117171 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.117130 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9rvtp" podStartSLOduration=3.752922472 podStartE2EDuration="21.117121266s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.56900563 +0000 UTC m=+3.118872236" lastFinishedPulling="2026-04-20 20:06:16.933204423 +0000 UTC m=+20.483071030" observedRunningTime="2026-04-20 20:06:18.116873635 +0000 UTC m=+21.666740249" 
watchObservedRunningTime="2026-04-20 20:06:18.117121266 +0000 UTC m=+21.666987881" Apr 20 20:06:18.135810 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.135771 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4qgql" podStartSLOduration=3.779366004 podStartE2EDuration="21.135763261s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.58245929 +0000 UTC m=+3.132325882" lastFinishedPulling="2026-04-20 20:06:16.938856534 +0000 UTC m=+20.488723139" observedRunningTime="2026-04-20 20:06:18.135563337 +0000 UTC m=+21.685429952" watchObservedRunningTime="2026-04-20 20:06:18.135763261 +0000 UTC m=+21.685629874" Apr 20 20:06:18.146412 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.146379 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dt2xp" podStartSLOduration=8.309153751 podStartE2EDuration="21.146370106s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.581782975 +0000 UTC m=+3.131649568" lastFinishedPulling="2026-04-20 20:06:12.41899933 +0000 UTC m=+15.968865923" observedRunningTime="2026-04-20 20:06:18.14621555 +0000 UTC m=+21.696082163" watchObservedRunningTime="2026-04-20 20:06:18.146370106 +0000 UTC m=+21.696236758" Apr 20 20:06:18.711870 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.711845 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 20:06:18.912977 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.912872 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:06:18.711866798Z","UUID":"432ebba6-6c14-4aa7-862f-67086b5c076c","Handler":null,"Name":"","Endpoint":""} Apr 20 20:06:18.915542 ip-10-0-136-158 
kubenswrapper[2573]: I0420 20:06:18.915506 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 20:06:18.915542 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.915539 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 20:06:18.973023 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.972999 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:18.973171 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:18.973152 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:18.973235 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:18.973162 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:18.973235 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:18.973226 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:19.074848 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:19.074804 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m557q" event={"ID":"01f7cf40-a02f-4e4d-846f-75cdf011fbb1","Type":"ContainerStarted","Data":"0ad88c08c9f4cbff58c74eb67344b19195c1db3965bc9991a77594ba144edb82"} Apr 20 20:06:19.077269 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:19.077236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" event={"ID":"cbd70467-a515-4215-9dc3-01d2315f4601","Type":"ContainerStarted","Data":"40c1410ff22a28583bc56df6321ddc03313b7c049662927895ae51aa30e4771d"} Apr 20 20:06:19.972678 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:19.972646 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:19.972816 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:19.972768 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:20.083362 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:20.083332 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:06:20.083793 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:20.083765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"f972515c1aed7ed449b403a4cf6d6d3f1263f8150cd7bdcd68e3ad13aa08f444"} Apr 20 20:06:20.606567 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:20.606529 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:06:20.607201 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:20.607183 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:06:20.622127 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:20.622082 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m557q" podStartSLOduration=6.262638955 podStartE2EDuration="23.622068476s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.574122321 +0000 UTC m=+3.123988913" lastFinishedPulling="2026-04-20 20:06:16.933551842 +0000 UTC m=+20.483418434" observedRunningTime="2026-04-20 20:06:19.099436819 +0000 UTC m=+22.649303445" watchObservedRunningTime="2026-04-20 20:06:20.622068476 +0000 UTC m=+24.171935091" Apr 20 20:06:20.973484 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:20.973152 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:20.973628 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:20.973541 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:20.973915 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:20.973233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:20.974040 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:20.973996 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:21.088011 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:21.087979 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" event={"ID":"cbd70467-a515-4215-9dc3-01d2315f4601","Type":"ContainerStarted","Data":"efb5e1396d4081df78235aae73ab1b650dfc0b86c269dce9c18c5a5325e644d3"} Apr 20 20:06:21.088512 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:21.088161 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:06:21.088797 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:21.088776 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9rvtp" Apr 20 20:06:21.105098 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:21.105037 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wj97z" podStartSLOduration=3.452168329 podStartE2EDuration="24.105017789s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.579486269 +0000 UTC m=+3.129352861" lastFinishedPulling="2026-04-20 20:06:20.23233573 +0000 UTC m=+23.782202321" observedRunningTime="2026-04-20 20:06:21.104774152 +0000 UTC m=+24.654640765" watchObservedRunningTime="2026-04-20 20:06:21.105017789 +0000 UTC m=+24.654884404" Apr 20 20:06:21.972824 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:21.972782 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:21.972985 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:21.972916 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:22.973315 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:22.973271 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:22.973981 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:22.973276 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:22.973981 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:22.973413 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:22.973981 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:22.973521 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:23.093497 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:23.093466 2573 generic.go:358] "Generic (PLEG): container finished" podID="99945c0f-09c6-48fb-84b1-4299b5936bd6" containerID="0973872f4cf232aa93c31b582203e82eac8b8cab54ce9fb4ea157c2bc744b595" exitCode=0 Apr 20 20:06:23.093663 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:23.093562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerDied","Data":"0973872f4cf232aa93c31b582203e82eac8b8cab54ce9fb4ea157c2bc744b595"} Apr 20 20:06:23.096831 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:23.096653 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:06:23.097166 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:23.097148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"1f207a9106016d1e465690849b03c86073c365a72441a2023efee6a99a5a71bf"} Apr 20 20:06:23.097602 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:23.097578 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:06:23.097683 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:23.097674 2573 scope.go:117] "RemoveContainer" containerID="796899dc9ca08c809f51703836cddc2fb76b77ce60ffee84be7ac8cadfb58d7b" Apr 20 20:06:23.113197 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:23.113178 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:06:23.973566 ip-10-0-136-158 kubenswrapper[2573]: I0420 
20:06:23.973540 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:23.973869 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:23.973670 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:24.100253 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.100223 2573 generic.go:358] "Generic (PLEG): container finished" podID="99945c0f-09c6-48fb-84b1-4299b5936bd6" containerID="330f6eac5a2bd0ee4a192999faaa93e553de3f47f841e1299cb4076294aee1f5" exitCode=0 Apr 20 20:06:24.100400 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.100317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerDied","Data":"330f6eac5a2bd0ee4a192999faaa93e553de3f47f841e1299cb4076294aee1f5"} Apr 20 20:06:24.103658 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.103641 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:06:24.103996 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.103972 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" event={"ID":"c5c20964-6b44-4902-91b4-e2f99aceca2f","Type":"ContainerStarted","Data":"31723c17939cb297cde980e471e2a1457f56c92f222022ad689fc3f0a674b090"} Apr 20 20:06:24.104170 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.104152 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:06:24.104262 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.104180 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:06:24.118240 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.118188 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:06:24.151788 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.151744 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" podStartSLOduration=9.57583992 podStartE2EDuration="27.15173321s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.578035008 +0000 UTC m=+3.127901616" lastFinishedPulling="2026-04-20 20:06:17.1539283 +0000 UTC m=+20.703794906" observedRunningTime="2026-04-20 20:06:24.151210111 +0000 UTC m=+27.701076726" watchObservedRunningTime="2026-04-20 20:06:24.15173321 +0000 UTC m=+27.701599824" Apr 20 20:06:24.244695 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.244659 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d2g4l"] Apr 20 20:06:24.244865 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.244789 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:24.244904 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:24.244886 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:24.247431 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.247408 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t9khm"] Apr 20 20:06:24.247540 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.247492 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:24.247579 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:24.247554 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:24.254597 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.254574 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvq2j"] Apr 20 20:06:24.254694 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:24.254668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:24.254745 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:24.254734 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:25.108980 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:25.108952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerStarted","Data":"ea6b899086c793a11ba34c9b2d1d78f6e8d5c19112f5da539286daad5f14e2d5"} Apr 20 20:06:25.973521 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:25.973476 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:25.973744 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:25.973583 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:25.973744 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:25.973603 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:25.973744 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:25.973618 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:25.973744 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:25.973713 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:25.974048 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:25.973801 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:26.113538 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:26.113508 2573 generic.go:358] "Generic (PLEG): container finished" podID="99945c0f-09c6-48fb-84b1-4299b5936bd6" containerID="ea6b899086c793a11ba34c9b2d1d78f6e8d5c19112f5da539286daad5f14e2d5" exitCode=0 Apr 20 20:06:26.113975 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:26.113599 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerDied","Data":"ea6b899086c793a11ba34c9b2d1d78f6e8d5c19112f5da539286daad5f14e2d5"} Apr 20 20:06:27.972691 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:27.972649 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:27.972691 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:27.972669 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:27.972691 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:27.972691 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:27.973082 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:27.972792 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:27.973250 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:27.973230 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:27.973366 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:27.973346 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:29.973765 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:29.973548 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:06:29.974219 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:29.973611 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:06:29.974219 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:29.973864 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t9khm" podUID="5409aec2-613d-49b4-aad6-5dda25f70168" Apr 20 20:06:29.974219 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:29.973958 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308" Apr 20 20:06:29.974219 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:29.973628 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j" Apr 20 20:06:29.974219 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:29.974062 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wvq2j" podUID="dd310fd8-ff36-47d6-9dbf-d8d029c30747" Apr 20 20:06:30.253716 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.253686 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-158.ec2.internal" event="NodeReady" Apr 20 20:06:30.253906 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.253848 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 20:06:30.287096 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.287061 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5957d944c5-6fq5k"] Apr 20 20:06:30.291145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.291108 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.294372 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.294341 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:06:30.294537 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.294387 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:06:30.294537 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.294341 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dkw75\"" Apr 20 20:06:30.294663 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.294649 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:06:30.299226 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.299195 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:06:30.300729 
ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.300689 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5957d944c5-6fq5k"] Apr 20 20:06:30.301647 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.301413 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7ssck"] Apr 20 20:06:30.304168 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.304149 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-566sx"] Apr 20 20:06:30.304364 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.304347 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.307340 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.306871 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 20:06:30.307340 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.306949 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 20:06:30.307340 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.307179 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhk8x\"" Apr 20 20:06:30.308220 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.308203 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-566sx" Apr 20 20:06:30.310228 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.310207 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 20:06:30.310365 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.310259 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 20:06:30.310365 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.310344 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xwtbs\"" Apr 20 20:06:30.310569 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.310554 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 20:06:30.317322 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.317027 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7ssck"] Apr 20 20:06:30.317620 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.317559 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-566sx"] Apr 20 20:06:30.426777 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.426738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-certificates\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.426978 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.426844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-ca-trust-extracted\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.426978 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.426879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4v9\" (UniqueName: \"kubernetes.io/projected/a5f9fd3a-20c3-49e2-860d-0b343b78d891-kube-api-access-8l4v9\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx" Apr 20 20:06:30.426978 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.426929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.426978 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.426952 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-trusted-ca\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.427150 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-installation-pull-secrets\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " 
pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.427150 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-image-registry-private-configuration\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.427234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.427234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427174 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/820f3779-0686-4cab-81ea-d64fa84a9bde-tmp-dir\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.427234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx" Apr 20 20:06:30.427410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-bound-sa-token\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.427410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwm2l\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-kube-api-access-gwm2l\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.427410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/820f3779-0686-4cab-81ea-d64fa84a9bde-config-volume\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.427410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.427388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2pz5\" (UniqueName: \"kubernetes.io/projected/820f3779-0686-4cab-81ea-d64fa84a9bde-kube-api-access-m2pz5\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.528107 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/820f3779-0686-4cab-81ea-d64fa84a9bde-config-volume\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.528107 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528067 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2pz5\" (UniqueName: \"kubernetes.io/projected/820f3779-0686-4cab-81ea-d64fa84a9bde-kube-api-access-m2pz5\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.528107 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-certificates\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-ca-trust-extracted\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4v9\" (UniqueName: \"kubernetes.io/projected/a5f9fd3a-20c3-49e2-860d-0b343b78d891-kube-api-access-8l4v9\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " 
pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-trusted-ca\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-installation-pull-secrets\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-image-registry-private-configuration\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528367 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/820f3779-0686-4cab-81ea-d64fa84a9bde-tmp-dir\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:30.528395 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-bound-sa-token\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwm2l\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-kube-api-access-gwm2l\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.528577 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.528643 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed.
No retries permitted until 2026-04-20 20:06:31.028624487 +0000 UTC m=+34.578491093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/820f3779-0686-4cab-81ea-d64fa84a9bde-config-volume\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.528582 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.528752 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.528774 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.528805 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:31.028787908 +0000 UTC m=+34.578654500 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found
Apr 20 20:06:30.528821 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.528824 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:06:31.028814896 +0000 UTC m=+34.578681491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:30.529364 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.528841 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/820f3779-0686-4cab-81ea-d64fa84a9bde-tmp-dir\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:30.529364 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.529352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-certificates\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.529496 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.529480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName:
\"kubernetes.io/empty-dir/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-ca-trust-extracted\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.530051 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.530014 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-trusted-ca\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.534009 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.533984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-installation-pull-secrets\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.534009 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.534002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-image-registry-private-configuration\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.538933 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.538896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2pz5\" (UniqueName: \"kubernetes.io/projected/820f3779-0686-4cab-81ea-d64fa84a9bde-kube-api-access-m2pz5\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:30.539621 ip-10-0-136-158
kubenswrapper[2573]: I0420 20:06:30.539599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-bound-sa-token\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.543988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.543966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwm2l\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-kube-api-access-gwm2l\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:30.544097 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.544000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4v9\" (UniqueName: \"kubernetes.io/projected/a5f9fd3a-20c3-49e2-860d-0b343b78d891-kube-api-access-8l4v9\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:06:30.729622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.729585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:30.729808 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:30.729660 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID:
\"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:30.729808 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.729764 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:30.729923 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.729846 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:30.729923 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.729870 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:30.729923 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.729883 2573 projected.go:194] Error preparing data for projected volume kube-api-access-wrt8k for pod openshift-network-diagnostics/network-check-target-t9khm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:30.729923 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.729852 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.729828671 +0000 UTC m=+66.279695265 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:30.730071 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:30.729947 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k podName:5409aec2-613d-49b4-aad6-5dda25f70168 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.729931062 +0000 UTC m=+66.279797660 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wrt8k" (UniqueName: "kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k") pod "network-check-target-t9khm" (UID: "5409aec2-613d-49b4-aad6-5dda25f70168") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:31.031873 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.031822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.031926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.031950 2573
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.032003 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.032030 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.032070 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.032105 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.032116 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:32.032097829 +0000 UTC m=+35.581964440 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.032185 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:32.032166147 +0000 UTC m=+35.582032744 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found
Apr 20 20:06:31.032435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.032205 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:06:32.032195269 +0000 UTC m=+35.582061868 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:31.536880 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.536821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:31.537066 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.536973 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:31.537066 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:31.537049 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret podName:dd310fd8-ff36-47d6-9dbf-d8d029c30747 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:03.53703293 +0000 UTC m=+67.086899546 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret") pod "global-pull-secret-syncer-wvq2j" (UID: "dd310fd8-ff36-47d6-9dbf-d8d029c30747") : object "kube-system"/"original-pull-secret" not registered
Apr 20 20:06:31.864891 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.864853 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"]
Apr 20 20:06:31.896470 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.896440 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"]
Apr 20 20:06:31.896642 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.896610 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"
Apr 20 20:06:31.901023 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.900990 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 20:06:31.901208 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.901031 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 20:06:31.901208 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.901094 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 20:06:31.901622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.901608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-gz96b\""
Apr 20 20:06:31.901699 ip-10-0-136-158
kubenswrapper[2573]: I0420 20:06:31.901629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 20:06:31.911461 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.911424 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"]
Apr 20 20:06:31.911612 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.911496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:31.914041 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.914015 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 20 20:06:31.937736 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.937700 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"]
Apr 20 20:06:31.937736 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.937730 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"]
Apr 20 20:06:31.937736 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.937743 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"]
Apr 20 20:06:31.937955 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.937840 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:31.938958 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.938934 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d6fb0dcc-9502-494c-86b3-da5eaae6b213-klusterlet-config\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:31.939075 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.939015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxf5g\" (UniqueName: \"kubernetes.io/projected/0dea69e5-e005-480f-b6f4-45b6319564ed-kube-api-access-bxf5g\") pod \"managed-serviceaccount-addon-agent-fc69b6cb5-nszp8\" (UID: \"0dea69e5-e005-480f-b6f4-45b6319564ed\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"
Apr 20 20:06:31.939075 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.939050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0dea69e5-e005-480f-b6f4-45b6319564ed-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fc69b6cb5-nszp8\" (UID: \"0dea69e5-e005-480f-b6f4-45b6319564ed\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"
Apr 20 20:06:31.939180 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.939109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d6fb0dcc-9502-494c-86b3-da5eaae6b213-tmp\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") "
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:31.939180 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.939147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4lr\" (UniqueName: \"kubernetes.io/projected/d6fb0dcc-9502-494c-86b3-da5eaae6b213-kube-api-access-qb4lr\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:31.940752 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.940729 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 20 20:06:31.940823 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.940808 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 20 20:06:31.940871 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.940829 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 20 20:06:31.940871 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.940836 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 20 20:06:31.973604 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.973568 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:06:31.973778 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.973568 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:06:31.973842 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.973568 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:06:31.976358 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.976337 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 20:06:31.976358 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.976357 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:06:31.976572 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.976424 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 20:06:31.976572 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.976484 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4tpsm\""
Apr 20 20:06:31.976809 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.976789 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9pkw7\""
Apr 20 20:06:31.976908 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:31.976798 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:06:32.040514 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4lr\" (UniqueName: \"kubernetes.io/projected/d6fb0dcc-9502-494c-86b3-da5eaae6b213-kube-api-access-qb4lr\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") "
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d6fb0dcc-9502-494c-86b3-da5eaae6b213-klusterlet-config\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-hub\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:32.040709 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxf5g\" (UniqueName: \"kubernetes.io/projected/0dea69e5-e005-480f-b6f4-45b6319564ed-kube-api-access-bxf5g\") pod \"managed-serviceaccount-addon-agent-fc69b6cb5-nszp8\" (UID: \"0dea69e5-e005-480f-b6f4-45b6319564ed\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:32.040759 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:06:34.040741211 +0000 UTC m=+37.590607804 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-ca\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:32.040782 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:32.040801 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found
Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040825 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0dea69e5-e005-480f-b6f4-45b6319564ed-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fc69b6cb5-nszp8\" (UID:
\"0dea69e5-e005-480f-b6f4-45b6319564ed\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8" Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/48f66300-968d-4c37-a0eb-56fbbf9831fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:32.040861 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:34.040844132 +0000 UTC m=+37.590710738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found Apr 20 20:06:32.040988 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:32.040923 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:32.041635 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:32.040975 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:34.040963499 +0000 UTC m=+37.590830105 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found Apr 20 20:06:32.041635 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.040997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" Apr 20 20:06:32.041635 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.041056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d6fb0dcc-9502-494c-86b3-da5eaae6b213-tmp\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" Apr 20 20:06:32.041635 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.041089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrm2h\" (UniqueName: \"kubernetes.io/projected/48f66300-968d-4c37-a0eb-56fbbf9831fb-kube-api-access-zrm2h\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" Apr 20 20:06:32.041635 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.041414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d6fb0dcc-9502-494c-86b3-da5eaae6b213-tmp\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: 
\"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:32.043427 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.043408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d6fb0dcc-9502-494c-86b3-da5eaae6b213-klusterlet-config\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:32.049277 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.049251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4lr\" (UniqueName: \"kubernetes.io/projected/d6fb0dcc-9502-494c-86b3-da5eaae6b213-kube-api-access-qb4lr\") pod \"klusterlet-addon-workmgr-9b7cbdf44-wdk2n\" (UID: \"d6fb0dcc-9502-494c-86b3-da5eaae6b213\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:32.054705 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.054676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0dea69e5-e005-480f-b6f4-45b6319564ed-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fc69b6cb5-nszp8\" (UID: \"0dea69e5-e005-480f-b6f4-45b6319564ed\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"
Apr 20 20:06:32.056731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.056710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxf5g\" (UniqueName: \"kubernetes.io/projected/0dea69e5-e005-480f-b6f4-45b6319564ed-kube-api-access-bxf5g\") pod \"managed-serviceaccount-addon-agent-fc69b6cb5-nszp8\" (UID: \"0dea69e5-e005-480f-b6f4-45b6319564ed\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"
Apr 20 20:06:32.142544 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.142445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.142544 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.142495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/48f66300-968d-4c37-a0eb-56fbbf9831fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.142544 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.142524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.142766 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.142637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrm2h\" (UniqueName: \"kubernetes.io/projected/48f66300-968d-4c37-a0eb-56fbbf9831fb-kube-api-access-zrm2h\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.142766 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.142748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-hub\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.142842 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.142802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-ca\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.143314 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.143234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/48f66300-968d-4c37-a0eb-56fbbf9831fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.145279 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.145250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.145404 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.145252 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-ca\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID:
\"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.145404 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.145337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-hub\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.145404 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.145401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/48f66300-968d-4c37-a0eb-56fbbf9831fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.151187 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.151161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrm2h\" (UniqueName: \"kubernetes.io/projected/48f66300-968d-4c37-a0eb-56fbbf9831fb-kube-api-access-zrm2h\") pod \"cluster-proxy-proxy-agent-66f999bc4c-g2qtb\" (UID: \"48f66300-968d-4c37-a0eb-56fbbf9831fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.223188 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.223148 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"
Apr 20 20:06:32.231182 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.231146 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:32.247581 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.247548 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"
Apr 20 20:06:32.431374 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.431087 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb"]
Apr 20 20:06:32.447866 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.447832 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8"]
Apr 20 20:06:32.453909 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:32.453874 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"]
Apr 20 20:06:32.457089 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:06:32.457059 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f66300_968d_4c37_a0eb_56fbbf9831fb.slice/crio-6821c94fda2d840e649d23ec2addef3b57d870fa442bbed49b3e48dfa53356f4 WatchSource:0}: Error finding container 6821c94fda2d840e649d23ec2addef3b57d870fa442bbed49b3e48dfa53356f4: Status 404 returned error can't find the container with id 6821c94fda2d840e649d23ec2addef3b57d870fa442bbed49b3e48dfa53356f4
Apr 20 20:06:32.458126 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:06:32.458106 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dea69e5_e005_480f_b6f4_45b6319564ed.slice/crio-d7601f3bc99acb8d7726dc2fc7cd2c63633b23123cb9c0e9c605c85967fbd415 WatchSource:0}: Error finding container d7601f3bc99acb8d7726dc2fc7cd2c63633b23123cb9c0e9c605c85967fbd415: Status 404 returned error can't find the container with id d7601f3bc99acb8d7726dc2fc7cd2c63633b23123cb9c0e9c605c85967fbd415
Apr 20 20:06:32.458586 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:06:32.458557 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6fb0dcc_9502_494c_86b3_da5eaae6b213.slice/crio-e624aceb58eb2b6033880791d704a2178f417cb10e7f2501c56d00a7a0f82881 WatchSource:0}: Error finding container e624aceb58eb2b6033880791d704a2178f417cb10e7f2501c56d00a7a0f82881: Status 404 returned error can't find the container with id e624aceb58eb2b6033880791d704a2178f417cb10e7f2501c56d00a7a0f82881
Apr 20 20:06:33.128159 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:33.128093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" event={"ID":"48f66300-968d-4c37-a0eb-56fbbf9831fb","Type":"ContainerStarted","Data":"6821c94fda2d840e649d23ec2addef3b57d870fa442bbed49b3e48dfa53356f4"}
Apr 20 20:06:33.129424 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:33.129377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" event={"ID":"d6fb0dcc-9502-494c-86b3-da5eaae6b213","Type":"ContainerStarted","Data":"e624aceb58eb2b6033880791d704a2178f417cb10e7f2501c56d00a7a0f82881"}
Apr 20 20:06:33.130797 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:33.130757 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8" event={"ID":"0dea69e5-e005-480f-b6f4-45b6319564ed","Type":"ContainerStarted","Data":"d7601f3bc99acb8d7726dc2fc7cd2c63633b23123cb9c0e9c605c85967fbd415"}
Apr 20 20:06:33.135020 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:33.134922 2573 generic.go:358] "Generic (PLEG):
container finished" podID="99945c0f-09c6-48fb-84b1-4299b5936bd6" containerID="9584b6f2c73120de0dab116c1212c9d5d3058789c83c4b1bf1e2234dd56ed1fe" exitCode=0
Apr 20 20:06:33.135020 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:33.134987 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerDied","Data":"9584b6f2c73120de0dab116c1212c9d5d3058789c83c4b1bf1e2234dd56ed1fe"}
Apr 20 20:06:34.060571 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:34.060481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:34.060738 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:34.060580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:34.060738 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:34.060609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:06:34.060738 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:34.060658 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:06:34.060738 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:34.060690 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found
Apr 20 20:06:34.060955 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:34.060754 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:38.060734183 +0000 UTC m=+41.610600778 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found
Apr 20 20:06:34.060955 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:34.060764 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:34.060955 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:34.060817 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:38.060800651 +0000 UTC m=+41.610667262 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found
Apr 20 20:06:34.060955 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:34.060888 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:34.060955 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:34.060921 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:06:38.06091052 +0000 UTC m=+41.610777148 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:34.142265 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:34.142224 2573 generic.go:358] "Generic (PLEG): container finished" podID="99945c0f-09c6-48fb-84b1-4299b5936bd6" containerID="548e256c1c673870deaaeb997106f3dbab877935ddd911b853f3eb7d1b3789c2" exitCode=0
Apr 20 20:06:34.143197 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:34.143163 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerDied","Data":"548e256c1c673870deaaeb997106f3dbab877935ddd911b853f3eb7d1b3789c2"}
Apr 20 20:06:35.148978 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:35.148409 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9zzt"
event={"ID":"99945c0f-09c6-48fb-84b1-4299b5936bd6","Type":"ContainerStarted","Data":"0cddb50bc7534ba6a5fd50231058642691e47cc2d64fff76600282bb858e68f7"}
Apr 20 20:06:35.175491 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:35.174993 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f9zzt" podStartSLOduration=5.583297308 podStartE2EDuration="38.17497212s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:05:59.581851831 +0000 UTC m=+3.131718438" lastFinishedPulling="2026-04-20 20:06:32.173526651 +0000 UTC m=+35.723393250" observedRunningTime="2026-04-20 20:06:35.173325026 +0000 UTC m=+38.723191641" watchObservedRunningTime="2026-04-20 20:06:35.17497212 +0000 UTC m=+38.724838737"
Apr 20 20:06:38.097527 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:38.097486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:38.097561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:38.097631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:38.097648 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:38.097705 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:38.097732 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:06:46.097710927 +0000 UTC m=+49.647577539 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:38.097743 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:38.097754 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:46.097740907 +0000 UTC m=+49.647607499 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:38.097758 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found
Apr 20 20:06:38.097949 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:38.097803 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:46.097791818 +0000 UTC m=+49.647658414 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found
Apr 20 20:06:39.158334 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:39.158275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" event={"ID":"48f66300-968d-4c37-a0eb-56fbbf9831fb","Type":"ContainerStarted","Data":"98ad0453b667fb4db4968f42ca301da503d65edbd806f05a1cffd0f450acf5ec"}
Apr 20 20:06:39.159691 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:39.159659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" event={"ID":"d6fb0dcc-9502-494c-86b3-da5eaae6b213","Type":"ContainerStarted","Data":"592a7d9ddb2d51faaf7db698d33e59e92c9790331629d67d3ce9f6b640c8497d"}
Apr 20 20:06:39.159896 ip-10-0-136-158 kubenswrapper[2573]: I0420
20:06:39.159868 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:39.161008 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:39.160984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8" event={"ID":"0dea69e5-e005-480f-b6f4-45b6319564ed","Type":"ContainerStarted","Data":"9a7fae2aab48fc8887c33e28933967fe928977b7592e7abc54d7498244a83556"}
Apr 20 20:06:39.161754 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:39.161733 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n"
Apr 20 20:06:39.174182 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:39.174141 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" podStartSLOduration=1.857387534 podStartE2EDuration="8.174126129s" podCreationTimestamp="2026-04-20 20:06:31 +0000 UTC" firstStartedPulling="2026-04-20 20:06:32.460244512 +0000 UTC m=+36.010111104" lastFinishedPulling="2026-04-20 20:06:38.776983105 +0000 UTC m=+42.326849699" observedRunningTime="2026-04-20 20:06:39.173895177 +0000 UTC m=+42.723761791" watchObservedRunningTime="2026-04-20 20:06:39.174126129 +0000 UTC m=+42.723992743"
Apr 20 20:06:39.187515 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:39.187459 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8" podStartSLOduration=1.884806487 podStartE2EDuration="8.187442519s" podCreationTimestamp="2026-04-20 20:06:31 +0000 UTC" firstStartedPulling="2026-04-20 20:06:32.460390282 +0000 UTC m=+36.010256877" lastFinishedPulling="2026-04-20 20:06:38.763026317 +0000 UTC m=+42.312892909" observedRunningTime="2026-04-20 20:06:39.186999121 +0000 UTC m=+42.736865735" watchObservedRunningTime="2026-04-20 20:06:39.187442519 +0000 UTC m=+42.737309155"
Apr 20 20:06:42.169528 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:42.169487 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" event={"ID":"48f66300-968d-4c37-a0eb-56fbbf9831fb","Type":"ContainerStarted","Data":"ddf96ae20285fab59d2bf5a5b3f7bacd19df276b0fc010d0bdc9be6d17cdb95a"}
Apr 20 20:06:42.169528 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:42.169532 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" event={"ID":"48f66300-968d-4c37-a0eb-56fbbf9831fb","Type":"ContainerStarted","Data":"45747a7f242a61fd504a91dc937caa245a62a5f21288b3ff2abcee7e6db0a5cf"}
Apr 20 20:06:42.187594 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:42.187545 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" podStartSLOduration=2.188018022 podStartE2EDuration="11.187530126s" podCreationTimestamp="2026-04-20 20:06:31 +0000 UTC" firstStartedPulling="2026-04-20 20:06:32.459213856 +0000 UTC m=+36.009080451" lastFinishedPulling="2026-04-20 20:06:41.458725961 +0000 UTC m=+45.008592555" observedRunningTime="2026-04-20 20:06:42.187247911 +0000 UTC m=+45.737114525" watchObservedRunningTime="2026-04-20 20:06:42.187530126 +0000 UTC m=+45.737396740"
Apr 20 20:06:46.169747 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:46.169712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:06:46.169747 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:46.169752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:46.169799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:46.169896 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:46.169896 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:46.169910 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:46.169965 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.169947578 +0000 UTC m=+65.719814170 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:46.169907 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:46.169979 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.169973703 +0000 UTC m=+65.719840294 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found Apr 20 20:06:46.170255 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:06:46.170016 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:02.169996594 +0000 UTC m=+65.719863189 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found Apr 20 20:06:56.130922 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:06:56.130882 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-664zl" Apr 20 20:07:02.201089 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.201051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.201115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck" Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.201136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx" Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.201228 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.201237 2573 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.201263 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.201275 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.201327 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:34.201314012 +0000 UTC m=+97.751180617 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.201357 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:07:34.201342561 +0000 UTC m=+97.751209164 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found Apr 20 20:07:02.201620 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.201403 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:34.20138622 +0000 UTC m=+97.751252827 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found Apr 20 20:07:02.805106 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.805043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:07:02.805347 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.805138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm" Apr 20 20:07:02.807823 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.807804 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 
20:07:02.807883 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.807867 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:07:02.815804 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.815779 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 20:07:02.815915 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:02.815858 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:06.815837004 +0000 UTC m=+130.365703610 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : secret "metrics-daemon-secret" not found
Apr 20 20:07:02.818096 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.818077 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:07:02.830265 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.830234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrt8k\" (UniqueName: \"kubernetes.io/projected/5409aec2-613d-49b4-aad6-5dda25f70168-kube-api-access-wrt8k\") pod \"network-check-target-t9khm\" (UID: \"5409aec2-613d-49b4-aad6-5dda25f70168\") " pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:07:02.907208 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.907176 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9pkw7\""
Apr 20 20:07:02.915042 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:02.915016 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:07:03.036330 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:03.036281 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t9khm"]
Apr 20 20:07:03.039406 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:07:03.039377 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5409aec2_613d_49b4_aad6_5dda25f70168.slice/crio-8972605f865e1a0ff242e9148d5950e7c2d94ce6b3dba1a62577b3dae4221f95 WatchSource:0}: Error finding container 8972605f865e1a0ff242e9148d5950e7c2d94ce6b3dba1a62577b3dae4221f95: Status 404 returned error can't find the container with id 8972605f865e1a0ff242e9148d5950e7c2d94ce6b3dba1a62577b3dae4221f95
Apr 20 20:07:03.210853 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:03.210821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t9khm" event={"ID":"5409aec2-613d-49b4-aad6-5dda25f70168","Type":"ContainerStarted","Data":"8972605f865e1a0ff242e9148d5950e7c2d94ce6b3dba1a62577b3dae4221f95"}
Apr 20 20:07:03.612218 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:03.612178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:07:03.614964 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:03.614945 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 20:07:03.625916 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:03.625893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dd310fd8-ff36-47d6-9dbf-d8d029c30747-original-pull-secret\") pod \"global-pull-secret-syncer-wvq2j\" (UID: \"dd310fd8-ff36-47d6-9dbf-d8d029c30747\") " pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:07:03.794540 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:03.794496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvq2j"
Apr 20 20:07:03.929813 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:03.929780 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvq2j"]
Apr 20 20:07:03.933333 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:07:03.933287 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd310fd8_ff36_47d6_9dbf_d8d029c30747.slice/crio-a360896aa0d3fbf4db122010f6fe0d68c617076d7dd60d3293393486b1914ccf WatchSource:0}: Error finding container a360896aa0d3fbf4db122010f6fe0d68c617076d7dd60d3293393486b1914ccf: Status 404 returned error can't find the container with id a360896aa0d3fbf4db122010f6fe0d68c617076d7dd60d3293393486b1914ccf
Apr 20 20:07:04.214354 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:04.214264 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvq2j" event={"ID":"dd310fd8-ff36-47d6-9dbf-d8d029c30747","Type":"ContainerStarted","Data":"a360896aa0d3fbf4db122010f6fe0d68c617076d7dd60d3293393486b1914ccf"}
Apr 20 20:07:07.223314 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:07.223257 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t9khm" event={"ID":"5409aec2-613d-49b4-aad6-5dda25f70168","Type":"ContainerStarted","Data":"ca0bdf1178b098b72a1854f17966b4720c09ec9cbe6820352112a70f0b27f087"}
Apr 20 20:07:07.223864 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:07.223564 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:07:07.239940 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:07.239020 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t9khm" podStartSLOduration=66.870803368 podStartE2EDuration="1m10.23900382s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:07:03.041357568 +0000 UTC m=+66.591224163" lastFinishedPulling="2026-04-20 20:07:06.409558023 +0000 UTC m=+69.959424615" observedRunningTime="2026-04-20 20:07:07.238482446 +0000 UTC m=+70.788349061" watchObservedRunningTime="2026-04-20 20:07:07.23900382 +0000 UTC m=+70.788870471"
Apr 20 20:07:08.227645 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:08.227555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvq2j" event={"ID":"dd310fd8-ff36-47d6-9dbf-d8d029c30747","Type":"ContainerStarted","Data":"af1af4f02b61275c9a5df8665d778e6398bd3ecf34908177e5c029771cd21e5d"}
Apr 20 20:07:08.242374 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:08.242325 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wvq2j" podStartSLOduration=65.240358225 podStartE2EDuration="1m9.242289573s" podCreationTimestamp="2026-04-20 20:05:59 +0000 UTC" firstStartedPulling="2026-04-20 20:07:03.935247775 +0000 UTC m=+67.485114367" lastFinishedPulling="2026-04-20 20:07:07.937179119 +0000 UTC m=+71.487045715" observedRunningTime="2026-04-20 20:07:08.241715915 +0000 UTC m=+71.791582542" watchObservedRunningTime="2026-04-20 20:07:08.242289573 +0000 UTC m=+71.792156189"
Apr 20 20:07:34.257223 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:34.257166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:34.257263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:34.257315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:34.257359 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:34.257404 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:34.257424 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:34.257448 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5957d944c5-6fq5k: secret "image-registry-tls" not found
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:34.257434 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert podName:a5f9fd3a-20c3-49e2-860d-0b343b78d891 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:38.257419712 +0000 UTC m=+161.807286305 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert") pod "ingress-canary-566sx" (UID: "a5f9fd3a-20c3-49e2-860d-0b343b78d891") : secret "canary-serving-cert" not found
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:34.257473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls podName:820f3779-0686-4cab-81ea-d64fa84a9bde nodeName:}" failed. No retries permitted until 2026-04-20 20:08:38.257461165 +0000 UTC m=+161.807327757 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls") pod "dns-default-7ssck" (UID: "820f3779-0686-4cab-81ea-d64fa84a9bde") : secret "dns-default-metrics-tls" not found
Apr 20 20:07:34.257801 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:07:34.257514 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls podName:f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:38.257497655 +0000 UTC m=+161.807364264 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls") pod "image-registry-5957d944c5-6fq5k" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4") : secret "image-registry-tls" not found
Apr 20 20:07:38.230976 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:07:38.230944 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t9khm"
Apr 20 20:08:06.909089 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:06.909030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l"
Apr 20 20:08:06.909607 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:06.909171 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 20:08:06.909607 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:06.909245 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs podName:9f213e16-074a-493b-b57c-f84483b57308 nodeName:}" failed. No retries permitted until 2026-04-20 20:10:08.909229097 +0000 UTC m=+252.459095689 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs") pod "network-metrics-daemon-d2g4l" (UID: "9f213e16-074a-493b-b57c-f84483b57308") : secret "metrics-daemon-secret" not found
Apr 20 20:08:14.993906 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:14.993876 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4qgql_6f6d52d5-73cf-459b-a235-e5cfe1d91c81/dns-node-resolver/0.log"
Apr 20 20:08:16.198417 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:16.198388 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dt2xp_aa0188c0-4215-47d0-a910-5a4c74cbc7cc/node-ca/0.log"
Apr 20 20:08:33.305208 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:33.305146 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" podUID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"
Apr 20 20:08:33.317511 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:33.317459 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7ssck" podUID="820f3779-0686-4cab-81ea-d64fa84a9bde"
Apr 20 20:08:33.324677 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:33.324639 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-566sx" podUID="a5f9fd3a-20c3-49e2-860d-0b343b78d891"
Apr 20 20:08:33.435618 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:33.433239 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:08:33.435618 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:33.433701 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:08:33.435618 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:33.434067 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7ssck"
Apr 20 20:08:35.000430 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:35.000381 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-d2g4l" podUID="9f213e16-074a-493b-b57c-f84483b57308"
Apr 20 20:08:38.349528 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.349434 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:08:38.349528 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.349477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:08:38.349988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.349557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:08:38.352011 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.351973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/820f3779-0686-4cab-81ea-d64fa84a9bde-metrics-tls\") pod \"dns-default-7ssck\" (UID: \"820f3779-0686-4cab-81ea-d64fa84a9bde\") " pod="openshift-dns/dns-default-7ssck"
Apr 20 20:08:38.352159 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.352112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"image-registry-5957d944c5-6fq5k\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:08:38.352159 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.352126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5f9fd3a-20c3-49e2-860d-0b343b78d891-cert\") pod \"ingress-canary-566sx\" (UID: \"a5f9fd3a-20c3-49e2-860d-0b343b78d891\") " pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:08:38.429215 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.429176 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-58cpq"]
Apr 20 20:08:38.432391 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.432371 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.436729 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.436702 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 20:08:38.436880 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.436734 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 20:08:38.436880 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.436781 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 20:08:38.436880 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.436734 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jp4bq\""
Apr 20 20:08:38.436880 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.436823 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 20:08:38.467860 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.467826 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-58cpq"]
Apr 20 20:08:38.537135 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.537106 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhk8x\""
Apr 20 20:08:38.537398 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.537326 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xwtbs\""
Apr 20 20:08:38.537825 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.537795 2573 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dkw75\""
Apr 20 20:08:38.543965 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.543942 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-566sx"
Apr 20 20:08:38.545671 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.545648 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k"
Apr 20 20:08:38.545788 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.545760 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7ssck"
Apr 20 20:08:38.551563 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.551529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2ed5cae5-de16-4920-ba49-c4347715ab85-data-volume\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.551677 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.551573 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2ed5cae5-de16-4920-ba49-c4347715ab85-crio-socket\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.551776 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.551669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjrw\" (UniqueName: \"kubernetes.io/projected/2ed5cae5-de16-4920-ba49-c4347715ab85-kube-api-access-qkjrw\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.551776 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.551712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2ed5cae5-de16-4920-ba49-c4347715ab85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.551776 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.551751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2ed5cae5-de16-4920-ba49-c4347715ab85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652137 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652109 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjrw\" (UniqueName: \"kubernetes.io/projected/2ed5cae5-de16-4920-ba49-c4347715ab85-kube-api-access-qkjrw\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652253 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2ed5cae5-de16-4920-ba49-c4347715ab85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652253 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2ed5cae5-de16-4920-ba49-c4347715ab85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652380 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2ed5cae5-de16-4920-ba49-c4347715ab85-data-volume\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652380 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2ed5cae5-de16-4920-ba49-c4347715ab85-crio-socket\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652472 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2ed5cae5-de16-4920-ba49-c4347715ab85-crio-socket\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652770 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2ed5cae5-de16-4920-ba49-c4347715ab85-data-volume\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.652900 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.652881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2ed5cae5-de16-4920-ba49-c4347715ab85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.655329 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.655281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2ed5cae5-de16-4920-ba49-c4347715ab85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.663040 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.662962 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjrw\" (UniqueName: \"kubernetes.io/projected/2ed5cae5-de16-4920-ba49-c4347715ab85-kube-api-access-qkjrw\") pod \"insights-runtime-extractor-58cpq\" (UID: \"2ed5cae5-de16-4920-ba49-c4347715ab85\") " pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.707350 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.707316 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-566sx"]
Apr 20 20:08:38.711742 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:08:38.711710 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5f9fd3a_20c3_49e2_860d_0b343b78d891.slice/crio-f04da669e27680aca3f1ddce2956f217c931d20a85f4787ecfbad7ec56c70ae2 WatchSource:0}: Error finding container f04da669e27680aca3f1ddce2956f217c931d20a85f4787ecfbad7ec56c70ae2: Status 404 returned error can't find the container with id f04da669e27680aca3f1ddce2956f217c931d20a85f4787ecfbad7ec56c70ae2
Apr 20 20:08:38.742426 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.742393 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-58cpq"
Apr 20 20:08:38.875827 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.875745 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-58cpq"]
Apr 20 20:08:38.878775 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:08:38.878741 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed5cae5_de16_4920_ba49_c4347715ab85.slice/crio-7264dcacaf730c3cea1e71b192376d69266326336ff7313de4a5dedd2ccd4267 WatchSource:0}: Error finding container 7264dcacaf730c3cea1e71b192376d69266326336ff7313de4a5dedd2ccd4267: Status 404 returned error can't find the container with id 7264dcacaf730c3cea1e71b192376d69266326336ff7313de4a5dedd2ccd4267
Apr 20 20:08:38.924817 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.924773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7ssck"]
Apr 20 20:08:38.925530 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:38.925505 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5957d944c5-6fq5k"]
Apr 20 20:08:38.928037 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:08:38.928000 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod820f3779_0686_4cab_81ea_d64fa84a9bde.slice/crio-3c1ebf506aa0e1aefac1519fc06abab2c3af0b7514becf9af956caf21d245b04 WatchSource:0}: Error finding container 3c1ebf506aa0e1aefac1519fc06abab2c3af0b7514becf9af956caf21d245b04: Status 404 returned error can't find the container with id 3c1ebf506aa0e1aefac1519fc06abab2c3af0b7514becf9af956caf21d245b04
Apr 20 20:08:38.928531
ip-10-0-136-158 kubenswrapper[2573]: W0420 20:08:38.928505 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b2aa87_dcc2_4dd6_b360_7f4c8f75fcf4.slice/crio-a45fda0ca08d05a742b097a1c450780576f20c1825bb242ab0a5fec0b999e9b7 WatchSource:0}: Error finding container a45fda0ca08d05a742b097a1c450780576f20c1825bb242ab0a5fec0b999e9b7: Status 404 returned error can't find the container with id a45fda0ca08d05a742b097a1c450780576f20c1825bb242ab0a5fec0b999e9b7 Apr 20 20:08:39.161089 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.160956 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" podUID="d6fb0dcc-9502-494c-86b3-da5eaae6b213" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 20 20:08:39.448568 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.448475 2573 generic.go:358] "Generic (PLEG): container finished" podID="d6fb0dcc-9502-494c-86b3-da5eaae6b213" containerID="592a7d9ddb2d51faaf7db698d33e59e92c9790331629d67d3ce9f6b640c8497d" exitCode=1 Apr 20 20:08:39.449081 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.448571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" event={"ID":"d6fb0dcc-9502-494c-86b3-da5eaae6b213","Type":"ContainerDied","Data":"592a7d9ddb2d51faaf7db698d33e59e92c9790331629d67d3ce9f6b640c8497d"} Apr 20 20:08:39.449081 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.448939 2573 scope.go:117] "RemoveContainer" containerID="592a7d9ddb2d51faaf7db698d33e59e92c9790331629d67d3ce9f6b640c8497d" Apr 20 20:08:39.450854 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.450821 2573 generic.go:358] "Generic (PLEG): container finished" podID="0dea69e5-e005-480f-b6f4-45b6319564ed" 
containerID="9a7fae2aab48fc8887c33e28933967fe928977b7592e7abc54d7498244a83556" exitCode=255 Apr 20 20:08:39.450963 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.450907 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8" event={"ID":"0dea69e5-e005-480f-b6f4-45b6319564ed","Type":"ContainerDied","Data":"9a7fae2aab48fc8887c33e28933967fe928977b7592e7abc54d7498244a83556"} Apr 20 20:08:39.451280 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.451260 2573 scope.go:117] "RemoveContainer" containerID="9a7fae2aab48fc8887c33e28933967fe928977b7592e7abc54d7498244a83556" Apr 20 20:08:39.454784 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.454753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" event={"ID":"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4","Type":"ContainerStarted","Data":"7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17"} Apr 20 20:08:39.454926 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.454796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" event={"ID":"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4","Type":"ContainerStarted","Data":"a45fda0ca08d05a742b097a1c450780576f20c1825bb242ab0a5fec0b999e9b7"} Apr 20 20:08:39.455154 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.455023 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:08:39.456920 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.456835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-566sx" event={"ID":"a5f9fd3a-20c3-49e2-860d-0b343b78d891","Type":"ContainerStarted","Data":"f04da669e27680aca3f1ddce2956f217c931d20a85f4787ecfbad7ec56c70ae2"} Apr 20 20:08:39.458769 ip-10-0-136-158 kubenswrapper[2573]: 
I0420 20:08:39.458740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-58cpq" event={"ID":"2ed5cae5-de16-4920-ba49-c4347715ab85","Type":"ContainerStarted","Data":"644e7661a78c11e1f8a155f017f590f18ff5aacaf002f4338174c7f5be675e5c"} Apr 20 20:08:39.458881 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.458776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-58cpq" event={"ID":"2ed5cae5-de16-4920-ba49-c4347715ab85","Type":"ContainerStarted","Data":"7264dcacaf730c3cea1e71b192376d69266326336ff7313de4a5dedd2ccd4267"} Apr 20 20:08:39.460665 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.460613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ssck" event={"ID":"820f3779-0686-4cab-81ea-d64fa84a9bde","Type":"ContainerStarted","Data":"3c1ebf506aa0e1aefac1519fc06abab2c3af0b7514becf9af956caf21d245b04"} Apr 20 20:08:39.512606 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:39.512556 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" podStartSLOduration=162.512538998 podStartE2EDuration="2m42.512538998s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:39.509958443 +0000 UTC m=+163.059825080" watchObservedRunningTime="2026-04-20 20:08:39.512538998 +0000 UTC m=+163.062405629" Apr 20 20:08:40.465349 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:40.465286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-58cpq" event={"ID":"2ed5cae5-de16-4920-ba49-c4347715ab85","Type":"ContainerStarted","Data":"746710b66def75e73eb64b0204a8f54bfee83814893241169cf8ea31cd97b2ce"} Apr 20 20:08:40.467042 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:40.467004 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" event={"ID":"d6fb0dcc-9502-494c-86b3-da5eaae6b213","Type":"ContainerStarted","Data":"bc84c132a0c42e9969c702d158e6bb2b71a6116f3097ecee5b0b3afea3c49009"} Apr 20 20:08:40.467371 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:40.467349 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" Apr 20 20:08:40.468046 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:40.468027 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9b7cbdf44-wdk2n" Apr 20 20:08:40.469037 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:40.469013 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fc69b6cb5-nszp8" event={"ID":"0dea69e5-e005-480f-b6f4-45b6319564ed","Type":"ContainerStarted","Data":"156a827b4118a0ed9ba0dd532a2faa28dbf9e0ceca7af2b144ca482eef29d876"} Apr 20 20:08:41.479933 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:41.479882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-566sx" event={"ID":"a5f9fd3a-20c3-49e2-860d-0b343b78d891","Type":"ContainerStarted","Data":"10a7dea35ef814b142f609c5124e550cd43171404da19aaabaff06726504ed2e"} Apr 20 20:08:41.481908 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:41.481873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ssck" event={"ID":"820f3779-0686-4cab-81ea-d64fa84a9bde","Type":"ContainerStarted","Data":"ff2097b2786c49f45082ee3eaa82732ae67caee1a7a93f5e9a374e7fd2dbe9b3"} Apr 20 20:08:41.482072 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:41.482004 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7ssck" 
event={"ID":"820f3779-0686-4cab-81ea-d64fa84a9bde","Type":"ContainerStarted","Data":"1be050ccf755a4f0a0aef96c6e9b1a21071a4a2332a4a2f7ee8517d8690df658"} Apr 20 20:08:41.496731 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:41.496651 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-566sx" podStartSLOduration=129.373776708 podStartE2EDuration="2m11.496635554s" podCreationTimestamp="2026-04-20 20:06:30 +0000 UTC" firstStartedPulling="2026-04-20 20:08:38.713504022 +0000 UTC m=+162.263370615" lastFinishedPulling="2026-04-20 20:08:40.836362864 +0000 UTC m=+164.386229461" observedRunningTime="2026-04-20 20:08:41.496253297 +0000 UTC m=+165.046119913" watchObservedRunningTime="2026-04-20 20:08:41.496635554 +0000 UTC m=+165.046502168" Apr 20 20:08:41.516764 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:41.516703 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7ssck" podStartSLOduration=129.608362699 podStartE2EDuration="2m11.516685026s" podCreationTimestamp="2026-04-20 20:06:30 +0000 UTC" firstStartedPulling="2026-04-20 20:08:38.929943134 +0000 UTC m=+162.479809729" lastFinishedPulling="2026-04-20 20:08:40.838265464 +0000 UTC m=+164.388132056" observedRunningTime="2026-04-20 20:08:41.515828432 +0000 UTC m=+165.065695050" watchObservedRunningTime="2026-04-20 20:08:41.516685026 +0000 UTC m=+165.066551639" Apr 20 20:08:42.486922 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:42.486881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-58cpq" event={"ID":"2ed5cae5-de16-4920-ba49-c4347715ab85","Type":"ContainerStarted","Data":"b779486fc04774f36a46abb3bad3ab3db8159e629fbe4c6f9139a3cbbf626f96"} Apr 20 20:08:42.487481 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:42.487462 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7ssck" Apr 20 
20:08:42.506648 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:42.506596 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-58cpq" podStartSLOduration=1.577765874 podStartE2EDuration="4.506580884s" podCreationTimestamp="2026-04-20 20:08:38 +0000 UTC" firstStartedPulling="2026-04-20 20:08:38.938416442 +0000 UTC m=+162.488283043" lastFinishedPulling="2026-04-20 20:08:41.867231459 +0000 UTC m=+165.417098053" observedRunningTime="2026-04-20 20:08:42.505227263 +0000 UTC m=+166.055093903" watchObservedRunningTime="2026-04-20 20:08:42.506580884 +0000 UTC m=+166.056447498" Apr 20 20:08:47.973253 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:47.973215 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:08:51.942561 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.942527 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fmx9j"] Apr 20 20:08:51.947176 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.947159 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:51.949631 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.949606 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 20:08:51.949758 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.949694 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7n892\"" Apr 20 20:08:51.949758 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.949722 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 20:08:51.950192 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.950179 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 20:08:51.950242 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.950211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:08:51.950283 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.950251 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:08:51.950283 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:51.950212 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:08:52.062536 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062503 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-textfile\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " 
pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062536 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e818016d-1b0a-49e0-8d0c-80e383b686e8-metrics-client-ca\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062746 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062746 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdlzg\" (UniqueName: \"kubernetes.io/projected/e818016d-1b0a-49e0-8d0c-80e383b686e8-kube-api-access-mdlzg\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062746 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-root\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062746 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-wtmp\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062746 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-sys\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062893 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-tls\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.062893 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.062832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-accelerators-collector-config\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.163850 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.163812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-sys\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.163850 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.163854 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-tls\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164057 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.163876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-accelerators-collector-config\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164057 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.163928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-sys\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164057 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:52.163976 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:52.164057 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-textfile\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164057 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:08:52.164040 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-tls 
podName:e818016d-1b0a-49e0-8d0c-80e383b686e8 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:52.66402402 +0000 UTC m=+176.213890631 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-tls") pod "node-exporter-fmx9j" (UID: "e818016d-1b0a-49e0-8d0c-80e383b686e8") : secret "node-exporter-tls" not found Apr 20 20:08:52.164237 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e818016d-1b0a-49e0-8d0c-80e383b686e8-metrics-client-ca\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164237 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164237 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdlzg\" (UniqueName: \"kubernetes.io/projected/e818016d-1b0a-49e0-8d0c-80e383b686e8-kube-api-access-mdlzg\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-root\") pod \"node-exporter-fmx9j\" (UID: 
\"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164374 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-textfile\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164375 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-wtmp\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164552 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164391 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-root\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164552 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-wtmp\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164616 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-accelerators-collector-config\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.164616 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.164599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e818016d-1b0a-49e0-8d0c-80e383b686e8-metrics-client-ca\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.166597 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.166572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.173421 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.173397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdlzg\" (UniqueName: \"kubernetes.io/projected/e818016d-1b0a-49e0-8d0c-80e383b686e8-kube-api-access-mdlzg\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.492893 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.492865 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7ssck" Apr 20 20:08:52.670190 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.670154 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-tls\") pod \"node-exporter-fmx9j\" (UID: 
\"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.672591 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.672566 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e818016d-1b0a-49e0-8d0c-80e383b686e8-node-exporter-tls\") pod \"node-exporter-fmx9j\" (UID: \"e818016d-1b0a-49e0-8d0c-80e383b686e8\") " pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.856108 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:52.856075 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fmx9j" Apr 20 20:08:52.867570 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:08:52.867540 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode818016d_1b0a_49e0_8d0c_80e383b686e8.slice/crio-a2b0b8d10acdb07dc419f302af89f72227f73cb54f8113d54a131dfaba4838d5 WatchSource:0}: Error finding container a2b0b8d10acdb07dc419f302af89f72227f73cb54f8113d54a131dfaba4838d5: Status 404 returned error can't find the container with id a2b0b8d10acdb07dc419f302af89f72227f73cb54f8113d54a131dfaba4838d5 Apr 20 20:08:53.517113 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:53.517084 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fmx9j" event={"ID":"e818016d-1b0a-49e0-8d0c-80e383b686e8","Type":"ContainerStarted","Data":"a2b0b8d10acdb07dc419f302af89f72227f73cb54f8113d54a131dfaba4838d5"} Apr 20 20:08:54.521841 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:54.521803 2573 generic.go:358] "Generic (PLEG): container finished" podID="e818016d-1b0a-49e0-8d0c-80e383b686e8" containerID="4be6013cd49c52d18858f03046bdfb151c8fb2b04f3fe16179c8d55e926ebf1b" exitCode=0 Apr 20 20:08:54.522271 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:54.521895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-fmx9j" event={"ID":"e818016d-1b0a-49e0-8d0c-80e383b686e8","Type":"ContainerDied","Data":"4be6013cd49c52d18858f03046bdfb151c8fb2b04f3fe16179c8d55e926ebf1b"} Apr 20 20:08:55.526090 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:55.526053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fmx9j" event={"ID":"e818016d-1b0a-49e0-8d0c-80e383b686e8","Type":"ContainerStarted","Data":"96a80f3691587bc79d43387224ea8c53f80aaee7d2d40446a405ab5132420902"} Apr 20 20:08:55.526090 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:55.526094 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fmx9j" event={"ID":"e818016d-1b0a-49e0-8d0c-80e383b686e8","Type":"ContainerStarted","Data":"a42007271402c95e7cb3e1baae85fc3d11eab0fdee64e57e2cf22a40129cb80a"} Apr 20 20:08:55.547364 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:55.547289 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fmx9j" podStartSLOduration=3.902256924 podStartE2EDuration="4.547273821s" podCreationTimestamp="2026-04-20 20:08:51 +0000 UTC" firstStartedPulling="2026-04-20 20:08:52.870222009 +0000 UTC m=+176.420088615" lastFinishedPulling="2026-04-20 20:08:53.515238915 +0000 UTC m=+177.065105512" observedRunningTime="2026-04-20 20:08:55.545823441 +0000 UTC m=+179.095690054" watchObservedRunningTime="2026-04-20 20:08:55.547273821 +0000 UTC m=+179.097140457" Apr 20 20:08:58.550142 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:08:58.550103 2573 patch_prober.go:28] interesting pod/image-registry-5957d944c5-6fq5k container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 20:08:58.550620 ip-10-0-136-158 kubenswrapper[2573]: 
I0420 20:08:58.550166 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" podUID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:09:00.243638 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:00.243601 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5957d944c5-6fq5k"] Apr 20 20:09:00.247758 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:00.247732 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:09:12.249591 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:12.249547 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" podUID="48f66300-968d-4c37-a0eb-56fbbf9831fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:09:22.249676 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:22.249622 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" podUID="48f66300-968d-4c37-a0eb-56fbbf9831fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:09:25.262992 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.262926 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" podUID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" containerName="registry" containerID="cri-o://7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17" gracePeriod=30 Apr 20 20:09:25.496090 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.496067 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:09:25.534105 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534021 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-bound-sa-token\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534105 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534066 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-installation-pull-secrets\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534105 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534103 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534451 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534142 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwm2l\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-kube-api-access-gwm2l\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534451 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534169 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-trusted-ca\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534451 
ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534196 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-certificates\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534451 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534238 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-image-registry-private-configuration\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534451 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534274 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-ca-trust-extracted\") pod \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\" (UID: \"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4\") " Apr 20 20:09:25.534782 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.534745 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:25.535274 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.535221 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:25.536823 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.536779 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:25.536976 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.536957 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:25.536976 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.536957 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-kube-api-access-gwm2l" (OuterVolumeSpecName: "kube-api-access-gwm2l") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). InnerVolumeSpecName "kube-api-access-gwm2l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:25.537097 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.536975 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:25.537194 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.537176 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:25.545145 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.545112 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" (UID: "f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:09:25.604686 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.604646 2573 generic.go:358] "Generic (PLEG): container finished" podID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" containerID="7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17" exitCode=0 Apr 20 20:09:25.604837 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.604715 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" Apr 20 20:09:25.604837 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.604733 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" event={"ID":"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4","Type":"ContainerDied","Data":"7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17"} Apr 20 20:09:25.604837 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.604776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5957d944c5-6fq5k" event={"ID":"f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4","Type":"ContainerDied","Data":"a45fda0ca08d05a742b097a1c450780576f20c1825bb242ab0a5fec0b999e9b7"} Apr 20 20:09:25.604837 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.604792 2573 scope.go:117] "RemoveContainer" containerID="7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17" Apr 20 20:09:25.613028 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.613011 2573 scope.go:117] "RemoveContainer" containerID="7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17" Apr 20 20:09:25.613320 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:09:25.613277 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17\": container with ID starting with 7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17 not found: ID does not exist" containerID="7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17" Apr 20 20:09:25.613393 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.613328 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17"} err="failed to get container status 
\"7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17\": rpc error: code = NotFound desc = could not find container \"7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17\": container with ID starting with 7b7da5076d0244b3b39d3695e55e36ecb30dc4bf4653c4bff78fedf1295cde17 not found: ID does not exist" Apr 20 20:09:25.624160 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.624138 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5957d944c5-6fq5k"] Apr 20 20:09:25.628164 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.628139 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5957d944c5-6fq5k"] Apr 20 20:09:25.635528 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635503 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-bound-sa-token\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.635622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635531 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-installation-pull-secrets\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.635622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635548 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.635622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635558 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwm2l\" (UniqueName: \"kubernetes.io/projected/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-kube-api-access-gwm2l\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath 
\"\"" Apr 20 20:09:25.635622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635567 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-trusted-ca\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.635622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635576 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-registry-certificates\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.635622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635585 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-image-registry-private-configuration\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:09:25.635622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:25.635594 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4-ca-trust-extracted\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:09:26.976203 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:26.976171 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" path="/var/lib/kubelet/pods/f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4/volumes" Apr 20 20:09:32.249152 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:32.249107 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" podUID="48f66300-968d-4c37-a0eb-56fbbf9831fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 20:09:32.249550 ip-10-0-136-158 
kubenswrapper[2573]: I0420 20:09:32.249206 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" Apr 20 20:09:32.249729 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:32.249711 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ddf96ae20285fab59d2bf5a5b3f7bacd19df276b0fc010d0bdc9be6d17cdb95a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 20:09:32.249765 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:32.249749 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" podUID="48f66300-968d-4c37-a0eb-56fbbf9831fb" containerName="service-proxy" containerID="cri-o://ddf96ae20285fab59d2bf5a5b3f7bacd19df276b0fc010d0bdc9be6d17cdb95a" gracePeriod=30 Apr 20 20:09:32.623928 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:32.623899 2573 generic.go:358] "Generic (PLEG): container finished" podID="48f66300-968d-4c37-a0eb-56fbbf9831fb" containerID="ddf96ae20285fab59d2bf5a5b3f7bacd19df276b0fc010d0bdc9be6d17cdb95a" exitCode=2 Apr 20 20:09:32.624086 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:32.623963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" event={"ID":"48f66300-968d-4c37-a0eb-56fbbf9831fb","Type":"ContainerDied","Data":"ddf96ae20285fab59d2bf5a5b3f7bacd19df276b0fc010d0bdc9be6d17cdb95a"} Apr 20 20:09:32.624086 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:09:32.624001 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f999bc4c-g2qtb" 
event={"ID":"48f66300-968d-4c37-a0eb-56fbbf9831fb","Type":"ContainerStarted","Data":"f588ca346ec358128b8510daabc82ec1882e5d08e50c0fa2a8a3ad2827e221ff"} Apr 20 20:10:08.953662 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:08.953574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:10:08.956081 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:08.956057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f213e16-074a-493b-b57c-f84483b57308-metrics-certs\") pod \"network-metrics-daemon-d2g4l\" (UID: \"9f213e16-074a-493b-b57c-f84483b57308\") " pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:10:08.976418 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:08.976393 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4tpsm\"" Apr 20 20:10:08.984348 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:08.984331 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d2g4l" Apr 20 20:10:09.103915 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:09.103881 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d2g4l"] Apr 20 20:10:09.108223 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:10:09.108174 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f213e16_074a_493b_b57c_f84483b57308.slice/crio-c95b1c5c0391d084fe8b336d38d82e4410bfa798cfb5b342722083236689ac94 WatchSource:0}: Error finding container c95b1c5c0391d084fe8b336d38d82e4410bfa798cfb5b342722083236689ac94: Status 404 returned error can't find the container with id c95b1c5c0391d084fe8b336d38d82e4410bfa798cfb5b342722083236689ac94 Apr 20 20:10:09.719716 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:09.719681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d2g4l" event={"ID":"9f213e16-074a-493b-b57c-f84483b57308","Type":"ContainerStarted","Data":"c95b1c5c0391d084fe8b336d38d82e4410bfa798cfb5b342722083236689ac94"} Apr 20 20:10:10.724008 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:10.723914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d2g4l" event={"ID":"9f213e16-074a-493b-b57c-f84483b57308","Type":"ContainerStarted","Data":"fb8e10498a7cb1204e05e762801f69d86441f7f6136d9cd3f963b72cbade1c18"} Apr 20 20:10:10.724008 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:10.723959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d2g4l" event={"ID":"9f213e16-074a-493b-b57c-f84483b57308","Type":"ContainerStarted","Data":"fe23fd2cc4f1c43bdefa32ff65e8acaf07274a6f7649c073775730742b845336"} Apr 20 20:10:10.743890 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:10.743839 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-d2g4l" podStartSLOduration=252.444440559 podStartE2EDuration="4m13.743819619s" podCreationTimestamp="2026-04-20 20:05:57 +0000 UTC" firstStartedPulling="2026-04-20 20:10:09.109934647 +0000 UTC m=+252.659801240" lastFinishedPulling="2026-04-20 20:10:10.409313705 +0000 UTC m=+253.959180300" observedRunningTime="2026-04-20 20:10:10.742119783 +0000 UTC m=+254.291986396" watchObservedRunningTime="2026-04-20 20:10:10.743819619 +0000 UTC m=+254.293686272" Apr 20 20:10:56.861045 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:56.861012 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:10:56.861573 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:56.861133 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:10:56.864480 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:10:56.864457 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 20:11:25.675709 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.675672 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"] Apr 20 20:11:25.678090 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.675906 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" containerName="registry" Apr 20 20:11:25.678090 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.675916 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" containerName="registry" Apr 20 20:11:25.678090 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.675960 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5b2aa87-dcc2-4dd6-b360-7f4c8f75fcf4" containerName="registry" Apr 
20 20:11:25.678967 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.678951 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.683747 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.683714 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 20 20:11:25.683901 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.683764 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-m9cwt\"" Apr 20 20:11:25.683901 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.683855 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 20 20:11:25.684691 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.684675 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 20 20:11:25.684766 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.684696 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 20 20:11:25.684811 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.684794 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 20 20:11:25.698129 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.698096 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"] Apr 20 20:11:25.783440 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.783398 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" 
(UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.783440 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.783438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wkf\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-kube-api-access-w2wkf\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.783726 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.783484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b26e915f-c347-4d6b-b992-4b1ec7d13d40-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.884775 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.884738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wkf\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-kube-api-access-w2wkf\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.884989 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.884804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b26e915f-c347-4d6b-b992-4b1ec7d13d40-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.884989 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.884856 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.884989 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:25.884945 2573 secret.go:281] references non-existent secret key: tls.crt Apr 20 20:11:25.884989 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:25.884959 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 20 20:11:25.884989 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:25.884981 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw: references non-existent secret key: tls.crt Apr 20 20:11:25.885246 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:25.885049 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates podName:b26e915f-c347-4d6b-b992-4b1ec7d13d40 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:26.385029831 +0000 UTC m=+329.934896425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates") pod "keda-metrics-apiserver-7c9f485588-vvvmw" (UID: "b26e915f-c347-4d6b-b992-4b1ec7d13d40") : references non-existent secret key: tls.crt Apr 20 20:11:25.885356 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.885323 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b26e915f-c347-4d6b-b992-4b1ec7d13d40-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.894845 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.894822 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2wkf\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-kube-api-access-w2wkf\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" Apr 20 20:11:25.963380 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.963280 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-xgntr"] Apr 20 20:11:25.966556 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.966539 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:25.968822 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.968801 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 20 20:11:25.977755 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.977732 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xgntr"]
Apr 20 20:11:25.985865 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.985838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv85q\" (UniqueName: \"kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-kube-api-access-sv85q\") pod \"keda-admission-cf49989db-xgntr\" (UID: \"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa\") " pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:25.985988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:25.985879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-certificates\") pod \"keda-admission-cf49989db-xgntr\" (UID: \"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa\") " pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:26.086528 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:26.086492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv85q\" (UniqueName: \"kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-kube-api-access-sv85q\") pod \"keda-admission-cf49989db-xgntr\" (UID: \"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa\") " pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:26.086528 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:26.086532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-certificates\") pod \"keda-admission-cf49989db-xgntr\" (UID: \"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa\") " pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:26.086834 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:26.086647 2573 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 20 20:11:26.086834 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:26.086671 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-xgntr: secret "keda-admission-webhooks-certs" not found
Apr 20 20:11:26.086834 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:26.086730 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-certificates podName:ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa nodeName:}" failed. No retries permitted until 2026-04-20 20:11:26.586711836 +0000 UTC m=+330.136578430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-certificates") pod "keda-admission-cf49989db-xgntr" (UID: "ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa") : secret "keda-admission-webhooks-certs" not found
Apr 20 20:11:26.098509 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:26.098475 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv85q\" (UniqueName: \"kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-kube-api-access-sv85q\") pod \"keda-admission-cf49989db-xgntr\" (UID: \"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa\") " pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:26.388231 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:26.388186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:26.388435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:26.388368 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 20 20:11:26.388435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:26.388391 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 20 20:11:26.388435 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:26.388411 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw: references non-existent secret key: tls.crt
Apr 20 20:11:26.388536 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:26.388469 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates podName:b26e915f-c347-4d6b-b992-4b1ec7d13d40 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:27.388451447 +0000 UTC m=+330.938318041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates") pod "keda-metrics-apiserver-7c9f485588-vvvmw" (UID: "b26e915f-c347-4d6b-b992-4b1ec7d13d40") : references non-existent secret key: tls.crt
Apr 20 20:11:26.589623 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:26.589584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-certificates\") pod \"keda-admission-cf49989db-xgntr\" (UID: \"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa\") " pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:26.592189 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:26.592158 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa-certificates\") pod \"keda-admission-cf49989db-xgntr\" (UID: \"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa\") " pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:26.876680 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:26.876638 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:27.006623 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:27.006590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xgntr"]
Apr 20 20:11:27.010176 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:11:27.010139 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede0ca05_e0ab_404e_86ac_85e0a0e4a9fa.slice/crio-773164ad846e3ea9f29ce81de13001b6c5957abdd6bcf8826b8e3c4f8cea44fa WatchSource:0}: Error finding container 773164ad846e3ea9f29ce81de13001b6c5957abdd6bcf8826b8e3c4f8cea44fa: Status 404 returned error can't find the container with id 773164ad846e3ea9f29ce81de13001b6c5957abdd6bcf8826b8e3c4f8cea44fa
Apr 20 20:11:27.011453 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:27.011435 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:11:27.395789 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:27.395749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:27.395966 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:27.395890 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 20 20:11:27.395966 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:27.395908 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 20 20:11:27.395966 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:27.395925 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw: references non-existent secret key: tls.crt
Apr 20 20:11:27.396075 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:27.395991 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates podName:b26e915f-c347-4d6b-b992-4b1ec7d13d40 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:29.39597385 +0000 UTC m=+332.945840443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates") pod "keda-metrics-apiserver-7c9f485588-vvvmw" (UID: "b26e915f-c347-4d6b-b992-4b1ec7d13d40") : references non-existent secret key: tls.crt
Apr 20 20:11:27.921808 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:27.921767 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xgntr" event={"ID":"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa","Type":"ContainerStarted","Data":"773164ad846e3ea9f29ce81de13001b6c5957abdd6bcf8826b8e3c4f8cea44fa"}
Apr 20 20:11:29.412759 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:29.412717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:29.413229 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:29.412907 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 20 20:11:29.413229 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:29.412939 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 20 20:11:29.413229 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:29.412964 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw: references non-existent secret key: tls.crt
Apr 20 20:11:29.413229 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:11:29.413035 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates podName:b26e915f-c347-4d6b-b992-4b1ec7d13d40 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:33.41301425 +0000 UTC m=+336.962880856 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates") pod "keda-metrics-apiserver-7c9f485588-vvvmw" (UID: "b26e915f-c347-4d6b-b992-4b1ec7d13d40") : references non-existent secret key: tls.crt
Apr 20 20:11:29.928223 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:29.928186 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xgntr" event={"ID":"ede0ca05-e0ab-404e-86ac-85e0a0e4a9fa","Type":"ContainerStarted","Data":"43ced46fd5bf8e33eeb9e8f1843ca6d012777cbf622dbf6742ef80dc210865b4"}
Apr 20 20:11:29.928410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:29.928344 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:11:29.945551 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:29.945415 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-xgntr" podStartSLOduration=2.6703784109999997 podStartE2EDuration="4.945398912s" podCreationTimestamp="2026-04-20 20:11:25 +0000 UTC" firstStartedPulling="2026-04-20 20:11:27.011570417 +0000 UTC m=+330.561437009" lastFinishedPulling="2026-04-20 20:11:29.286590917 +0000 UTC m=+332.836457510" observedRunningTime="2026-04-20 20:11:29.944149062 +0000 UTC m=+333.494015677" watchObservedRunningTime="2026-04-20 20:11:29.945398912 +0000 UTC m=+333.495265525"
Apr 20 20:11:33.442622 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:33.442571 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:33.445377 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:33.445350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b26e915f-c347-4d6b-b992-4b1ec7d13d40-certificates\") pod \"keda-metrics-apiserver-7c9f485588-vvvmw\" (UID: \"b26e915f-c347-4d6b-b992-4b1ec7d13d40\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:33.489029 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:33.488991 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:33.610635 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:33.610600 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"]
Apr 20 20:11:33.613983 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:11:33.613953 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb26e915f_c347_4d6b_b992_4b1ec7d13d40.slice/crio-778d16f0e316c6d62a02dd677efabd5267f5a2232238d45cc126ee3ced700acf WatchSource:0}: Error finding container 778d16f0e316c6d62a02dd677efabd5267f5a2232238d45cc126ee3ced700acf: Status 404 returned error can't find the container with id 778d16f0e316c6d62a02dd677efabd5267f5a2232238d45cc126ee3ced700acf
Apr 20 20:11:33.939066 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:33.939028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" event={"ID":"b26e915f-c347-4d6b-b992-4b1ec7d13d40","Type":"ContainerStarted","Data":"778d16f0e316c6d62a02dd677efabd5267f5a2232238d45cc126ee3ced700acf"}
Apr 20 20:11:36.948437 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:36.948397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" event={"ID":"b26e915f-c347-4d6b-b992-4b1ec7d13d40","Type":"ContainerStarted","Data":"abd8d0326a4acc0ac10460848c1e8c7adc776467b327131416d3708bb563638e"}
Apr 20 20:11:36.948916 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:36.948530 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:36.965587 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:36.965536 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw" podStartSLOduration=9.129011025 podStartE2EDuration="11.965519118s" podCreationTimestamp="2026-04-20 20:11:25 +0000 UTC" firstStartedPulling="2026-04-20 20:11:33.617936629 +0000 UTC m=+337.167803223" lastFinishedPulling="2026-04-20 20:11:36.454444712 +0000 UTC m=+340.004311316" observedRunningTime="2026-04-20 20:11:36.964679135 +0000 UTC m=+340.514545750" watchObservedRunningTime="2026-04-20 20:11:36.965519118 +0000 UTC m=+340.515385771"
Apr 20 20:11:47.956008 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:47.955973 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-vvvmw"
Apr 20 20:11:50.934132 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:11:50.934101 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-xgntr"
Apr 20 20:13:43.949189 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:43.949134 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-r898s"]
Apr 20 20:13:43.951123 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:43.951105 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:43.953739 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:43.953717 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-psdp7\""
Apr 20 20:13:43.954177 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:43.954161 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 20 20:13:43.954274 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:43.954199 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 20 20:13:43.954459 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:43.954442 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 20 20:13:43.963830 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:43.963809 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-r898s"]
Apr 20 20:13:44.087727 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.087682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e939f77-2e54-4923-9373-24d56cc11538-cert\") pod \"odh-model-controller-696fc77849-r898s\" (UID: \"9e939f77-2e54-4923-9373-24d56cc11538\") " pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:44.088202 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.088165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgvf\" (UniqueName: \"kubernetes.io/projected/9e939f77-2e54-4923-9373-24d56cc11538-kube-api-access-xhgvf\") pod \"odh-model-controller-696fc77849-r898s\" (UID: \"9e939f77-2e54-4923-9373-24d56cc11538\") " pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:44.188926 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.188868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgvf\" (UniqueName: \"kubernetes.io/projected/9e939f77-2e54-4923-9373-24d56cc11538-kube-api-access-xhgvf\") pod \"odh-model-controller-696fc77849-r898s\" (UID: \"9e939f77-2e54-4923-9373-24d56cc11538\") " pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:44.189137 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.188952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e939f77-2e54-4923-9373-24d56cc11538-cert\") pod \"odh-model-controller-696fc77849-r898s\" (UID: \"9e939f77-2e54-4923-9373-24d56cc11538\") " pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:44.191593 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.191569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e939f77-2e54-4923-9373-24d56cc11538-cert\") pod \"odh-model-controller-696fc77849-r898s\" (UID: \"9e939f77-2e54-4923-9373-24d56cc11538\") " pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:44.197882 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.197855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgvf\" (UniqueName: \"kubernetes.io/projected/9e939f77-2e54-4923-9373-24d56cc11538-kube-api-access-xhgvf\") pod \"odh-model-controller-696fc77849-r898s\" (UID: \"9e939f77-2e54-4923-9373-24d56cc11538\") " pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:44.260960 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.260868 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:44.382710 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:44.382676 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-r898s"]
Apr 20 20:13:44.385991 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:13:44.385962 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e939f77_2e54_4923_9373_24d56cc11538.slice/crio-0e24238fbac4574200435f31ed4145e03e99613785107009efd0376c5c9e27d3 WatchSource:0}: Error finding container 0e24238fbac4574200435f31ed4145e03e99613785107009efd0376c5c9e27d3: Status 404 returned error can't find the container with id 0e24238fbac4574200435f31ed4145e03e99613785107009efd0376c5c9e27d3
Apr 20 20:13:45.285589 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:45.285547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-r898s" event={"ID":"9e939f77-2e54-4923-9373-24d56cc11538","Type":"ContainerStarted","Data":"0e24238fbac4574200435f31ed4145e03e99613785107009efd0376c5c9e27d3"}
Apr 20 20:13:48.296784 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:48.296742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-r898s" event={"ID":"9e939f77-2e54-4923-9373-24d56cc11538","Type":"ContainerStarted","Data":"71a3f2590002063c6730343d385e639a630f90953db06e999275ecc430b6852c"}
Apr 20 20:13:48.297178 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:48.296928 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:13:48.312453 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:48.312362 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-r898s" podStartSLOduration=2.435650813 podStartE2EDuration="5.312344325s" podCreationTimestamp="2026-04-20 20:13:43 +0000 UTC" firstStartedPulling="2026-04-20 20:13:44.387196769 +0000 UTC m=+467.937063361" lastFinishedPulling="2026-04-20 20:13:47.263890281 +0000 UTC m=+470.813756873" observedRunningTime="2026-04-20 20:13:48.312091173 +0000 UTC m=+471.861957801" watchObservedRunningTime="2026-04-20 20:13:48.312344325 +0000 UTC m=+471.862210940"
Apr 20 20:13:59.301693 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:13:59.301651 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-r898s"
Apr 20 20:15:56.878577 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:15:56.878542 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log"
Apr 20 20:15:56.879098 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:15:56.878908 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log"
Apr 20 20:17:45.655035 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.654942 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"]
Apr 20 20:17:45.657280 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.657257 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:45.659613 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.659589 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lk7n4\""
Apr 20 20:17:45.659613 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.659608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-87d08-serving-cert\""
Apr 20 20:17:45.659757 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.659609 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-87d08-kube-rbac-proxy-sar-config\""
Apr 20 20:17:45.660441 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.660427 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 20:17:45.667174 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.667152 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"]
Apr 20 20:17:45.815053 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.815014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93abaa13-1671-47af-8ad3-fa90ddd8fe04-openshift-service-ca-bundle\") pod \"model-chainer-raw-87d08-7cf6cc6959-nllms\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") " pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:45.815245 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.815069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93abaa13-1671-47af-8ad3-fa90ddd8fe04-proxy-tls\") pod \"model-chainer-raw-87d08-7cf6cc6959-nllms\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") " pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:45.915747 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.915645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93abaa13-1671-47af-8ad3-fa90ddd8fe04-proxy-tls\") pod \"model-chainer-raw-87d08-7cf6cc6959-nllms\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") " pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:45.915747 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.915706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93abaa13-1671-47af-8ad3-fa90ddd8fe04-openshift-service-ca-bundle\") pod \"model-chainer-raw-87d08-7cf6cc6959-nllms\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") " pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:45.916342 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.916318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93abaa13-1671-47af-8ad3-fa90ddd8fe04-openshift-service-ca-bundle\") pod \"model-chainer-raw-87d08-7cf6cc6959-nllms\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") " pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:45.918233 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.918213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93abaa13-1671-47af-8ad3-fa90ddd8fe04-proxy-tls\") pod \"model-chainer-raw-87d08-7cf6cc6959-nllms\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") " pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:45.967960 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:45.967924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:46.087714 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:46.087672 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"]
Apr 20 20:17:46.091759 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:17:46.091730 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93abaa13_1671_47af_8ad3_fa90ddd8fe04.slice/crio-2566e4a78f652b09e73a58648ab73928feb6055b05dc509f08563af6f8b26b9c WatchSource:0}: Error finding container 2566e4a78f652b09e73a58648ab73928feb6055b05dc509f08563af6f8b26b9c: Status 404 returned error can't find the container with id 2566e4a78f652b09e73a58648ab73928feb6055b05dc509f08563af6f8b26b9c
Apr 20 20:17:46.093421 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:46.093401 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:17:46.932467 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:46.932433 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" event={"ID":"93abaa13-1671-47af-8ad3-fa90ddd8fe04","Type":"ContainerStarted","Data":"2566e4a78f652b09e73a58648ab73928feb6055b05dc509f08563af6f8b26b9c"}
Apr 20 20:17:48.938508 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:48.938463 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" event={"ID":"93abaa13-1671-47af-8ad3-fa90ddd8fe04","Type":"ContainerStarted","Data":"3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e"}
Apr 20 20:17:48.938887 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:48.938582 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:48.954817 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:48.954767 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podStartSLOduration=1.480660877 podStartE2EDuration="3.954753385s" podCreationTimestamp="2026-04-20 20:17:45 +0000 UTC" firstStartedPulling="2026-04-20 20:17:46.093531918 +0000 UTC m=+709.643398509" lastFinishedPulling="2026-04-20 20:17:48.567624421 +0000 UTC m=+712.117491017" observedRunningTime="2026-04-20 20:17:48.953536249 +0000 UTC m=+712.503402863" watchObservedRunningTime="2026-04-20 20:17:48.954753385 +0000 UTC m=+712.504619999"
Apr 20 20:17:54.948752 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:54.948722 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:17:55.715589 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:55.715557 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"]
Apr 20 20:17:55.715808 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:55.715763 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" containerID="cri-o://3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e" gracePeriod=30
Apr 20 20:17:59.945695 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:17:59.945653 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:04.946356 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:04.946286 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:09.945947 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:09.945899 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:09.946396 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:09.946004 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:18:14.946234 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:14.946186 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:19.946078 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:19.946040 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:24.946203 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:24.946160 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:25.735803 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:18:25.735761 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93abaa13_1671_47af_8ad3_fa90ddd8fe04.slice/crio-3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93abaa13_1671_47af_8ad3_fa90ddd8fe04.slice/crio-conmon-3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e.scope\": RecentStats: unable to find data in memory cache]"
Apr 20 20:18:25.735915 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:18:25.735818 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93abaa13_1671_47af_8ad3_fa90ddd8fe04.slice/crio-conmon-3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e.scope\": RecentStats: unable to find data in memory cache]"
Apr 20 20:18:25.735915 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:18:25.735890 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93abaa13_1671_47af_8ad3_fa90ddd8fe04.slice/crio-conmon-3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93abaa13_1671_47af_8ad3_fa90ddd8fe04.slice/crio-3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e.scope\": RecentStats: unable to find data in memory cache]"
Apr 20 20:18:26.035446 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.035351 2573 generic.go:358] "Generic (PLEG): container finished" podID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerID="3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e" exitCode=0
Apr 20 20:18:26.035446 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.035410 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" event={"ID":"93abaa13-1671-47af-8ad3-fa90ddd8fe04","Type":"ContainerDied","Data":"3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e"}
Apr 20 20:18:26.362329 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.362283 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"
Apr 20 20:18:26.416332 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.416279 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93abaa13-1671-47af-8ad3-fa90ddd8fe04-openshift-service-ca-bundle\") pod \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") "
Apr 20 20:18:26.416522 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.416371 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93abaa13-1671-47af-8ad3-fa90ddd8fe04-proxy-tls\") pod \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\" (UID: \"93abaa13-1671-47af-8ad3-fa90ddd8fe04\") "
Apr 20 20:18:26.416614 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.416589 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93abaa13-1671-47af-8ad3-fa90ddd8fe04-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "93abaa13-1671-47af-8ad3-fa90ddd8fe04" (UID: "93abaa13-1671-47af-8ad3-fa90ddd8fe04"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:18:26.418623 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.418599 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93abaa13-1671-47af-8ad3-fa90ddd8fe04-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "93abaa13-1671-47af-8ad3-fa90ddd8fe04" (UID: "93abaa13-1671-47af-8ad3-fa90ddd8fe04"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:18:26.517235 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.517192 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93abaa13-1671-47af-8ad3-fa90ddd8fe04-openshift-service-ca-bundle\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 20 20:18:26.517235 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:26.517228 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93abaa13-1671-47af-8ad3-fa90ddd8fe04-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\""
Apr 20 20:18:27.038656 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:27.038615 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" event={"ID":"93abaa13-1671-47af-8ad3-fa90ddd8fe04","Type":"ContainerDied","Data":"2566e4a78f652b09e73a58648ab73928feb6055b05dc509f08563af6f8b26b9c"}
Apr 20 20:18:27.038656 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:27.038659 2573 scope.go:117] "RemoveContainer" containerID="3dda93d822cee4b87d6cdb52cda3b66cdbfd26d94b312b28e93678c0e5dd538e"
Apr 20 20:18:27.039177 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:27.038706 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms" Apr 20 20:18:27.055398 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:27.055366 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"] Apr 20 20:18:27.058565 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:27.058542 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-87d08-7cf6cc6959-nllms"] Apr 20 20:18:28.977104 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:18:28.977072 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" path="/var/lib/kubelet/pods/93abaa13-1671-47af-8ad3-fa90ddd8fe04/volumes" Apr 20 20:19:25.972514 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.972424 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f"] Apr 20 20:19:25.973078 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.972788 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" Apr 20 20:19:25.973078 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.972805 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" Apr 20 20:19:25.973078 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.972868 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="93abaa13-1671-47af-8ad3-fa90ddd8fe04" containerName="model-chainer-raw-87d08" Apr 20 20:19:25.975702 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.975680 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:25.978065 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.978042 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lk7n4\"" Apr 20 20:19:25.978288 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.978271 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3bab1-serving-cert\"" Apr 20 20:19:25.978380 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.978272 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3bab1-kube-rbac-proxy-sar-config\"" Apr 20 20:19:25.978876 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.978847 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:19:25.983221 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:25.983202 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f"] Apr 20 20:19:26.066511 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.066471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls\") pod \"model-chainer-raw-hpa-3bab1-558467f979-8cb6f\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:26.066693 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.066530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d90a3502-774b-42a3-9d44-d0c19127b044-openshift-service-ca-bundle\") pod 
\"model-chainer-raw-hpa-3bab1-558467f979-8cb6f\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:26.167844 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.167803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls\") pod \"model-chainer-raw-hpa-3bab1-558467f979-8cb6f\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:26.168026 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.167856 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d90a3502-774b-42a3-9d44-d0c19127b044-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3bab1-558467f979-8cb6f\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:26.168026 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:19:26.167964 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-serving-cert: secret "model-chainer-raw-hpa-3bab1-serving-cert" not found Apr 20 20:19:26.168097 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:19:26.168050 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls podName:d90a3502-774b-42a3-9d44-d0c19127b044 nodeName:}" failed. No retries permitted until 2026-04-20 20:19:26.66802743 +0000 UTC m=+810.217894042 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls") pod "model-chainer-raw-hpa-3bab1-558467f979-8cb6f" (UID: "d90a3502-774b-42a3-9d44-d0c19127b044") : secret "model-chainer-raw-hpa-3bab1-serving-cert" not found Apr 20 20:19:26.168490 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.168473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d90a3502-774b-42a3-9d44-d0c19127b044-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3bab1-558467f979-8cb6f\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:26.672126 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.672078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls\") pod \"model-chainer-raw-hpa-3bab1-558467f979-8cb6f\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:26.674714 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.674683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls\") pod \"model-chainer-raw-hpa-3bab1-558467f979-8cb6f\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:26.887057 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:26.887009 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:27.017378 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:27.017349 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f"] Apr 20 20:19:27.019952 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:19:27.019924 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd90a3502_774b_42a3_9d44_d0c19127b044.slice/crio-8ce369a556eebb43913ae1cd0fe91c47e2c34a7c90ca7c173d9562d499a2ffbf WatchSource:0}: Error finding container 8ce369a556eebb43913ae1cd0fe91c47e2c34a7c90ca7c173d9562d499a2ffbf: Status 404 returned error can't find the container with id 8ce369a556eebb43913ae1cd0fe91c47e2c34a7c90ca7c173d9562d499a2ffbf Apr 20 20:19:27.203201 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:27.203104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" event={"ID":"d90a3502-774b-42a3-9d44-d0c19127b044","Type":"ContainerStarted","Data":"a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902"} Apr 20 20:19:27.203201 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:27.203149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" event={"ID":"d90a3502-774b-42a3-9d44-d0c19127b044","Type":"ContainerStarted","Data":"8ce369a556eebb43913ae1cd0fe91c47e2c34a7c90ca7c173d9562d499a2ffbf"} Apr 20 20:19:27.203422 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:27.203231 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:27.223508 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:27.223455 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podStartSLOduration=2.223437579 podStartE2EDuration="2.223437579s" podCreationTimestamp="2026-04-20 20:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:19:27.22294355 +0000 UTC m=+810.772810175" watchObservedRunningTime="2026-04-20 20:19:27.223437579 +0000 UTC m=+810.773304194" Apr 20 20:19:33.211767 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:33.211737 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:36.013713 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:36.013673 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f"] Apr 20 20:19:36.014204 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:36.013998 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" containerID="cri-o://a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902" gracePeriod=30 Apr 20 20:19:38.210368 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:38.210324 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:43.210700 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:43.210657 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 20 20:19:48.210360 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:48.210318 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:48.210785 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:48.210443 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:19:53.210356 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:53.210285 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:58.209640 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:19:58.209600 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:20:03.211255 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:03.211204 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:20:06.038105 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:20:06.038074 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd90a3502_774b_42a3_9d44_d0c19127b044.slice/crio-conmon-a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902.scope\": RecentStats: unable to find data in memory cache]" Apr 20 20:20:06.038478 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:20:06.038232 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd90a3502_774b_42a3_9d44_d0c19127b044.slice/crio-conmon-a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902.scope\": RecentStats: unable to find data in memory cache]" Apr 20 20:20:06.144755 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.144728 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:20:06.262364 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.262318 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls\") pod \"d90a3502-774b-42a3-9d44-d0c19127b044\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " Apr 20 20:20:06.262364 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.262366 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d90a3502-774b-42a3-9d44-d0c19127b044-openshift-service-ca-bundle\") pod \"d90a3502-774b-42a3-9d44-d0c19127b044\" (UID: \"d90a3502-774b-42a3-9d44-d0c19127b044\") " Apr 20 20:20:06.262755 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.262725 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d90a3502-774b-42a3-9d44-d0c19127b044-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod 
"d90a3502-774b-42a3-9d44-d0c19127b044" (UID: "d90a3502-774b-42a3-9d44-d0c19127b044"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:20:06.264578 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.264551 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d90a3502-774b-42a3-9d44-d0c19127b044" (UID: "d90a3502-774b-42a3-9d44-d0c19127b044"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:20:06.314972 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.314936 2573 generic.go:358] "Generic (PLEG): container finished" podID="d90a3502-774b-42a3-9d44-d0c19127b044" containerID="a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902" exitCode=0 Apr 20 20:20:06.315150 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.315012 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" Apr 20 20:20:06.315150 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.315019 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" event={"ID":"d90a3502-774b-42a3-9d44-d0c19127b044","Type":"ContainerDied","Data":"a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902"} Apr 20 20:20:06.315150 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.315056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f" event={"ID":"d90a3502-774b-42a3-9d44-d0c19127b044","Type":"ContainerDied","Data":"8ce369a556eebb43913ae1cd0fe91c47e2c34a7c90ca7c173d9562d499a2ffbf"} Apr 20 20:20:06.315150 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.315071 2573 scope.go:117] "RemoveContainer" containerID="a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902" Apr 20 20:20:06.323114 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.323093 2573 scope.go:117] "RemoveContainer" containerID="a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902" Apr 20 20:20:06.323413 ip-10-0-136-158 kubenswrapper[2573]: E0420 20:20:06.323395 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902\": container with ID starting with a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902 not found: ID does not exist" containerID="a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902" Apr 20 20:20:06.323475 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.323423 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902"} err="failed to get container status 
\"a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902\": rpc error: code = NotFound desc = could not find container \"a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902\": container with ID starting with a7f43634ac92f473c1cc675d0104287e5b92652d7750838d920535d6021ce902 not found: ID does not exist" Apr 20 20:20:06.333587 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.333565 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f"] Apr 20 20:20:06.336760 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.336733 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3bab1-558467f979-8cb6f"] Apr 20 20:20:06.362866 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.362828 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d90a3502-774b-42a3-9d44-d0c19127b044-proxy-tls\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:20:06.362866 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.362856 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d90a3502-774b-42a3-9d44-d0c19127b044-openshift-service-ca-bundle\") on node \"ip-10-0-136-158.ec2.internal\" DevicePath \"\"" Apr 20 20:20:06.977083 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:06.977049 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" path="/var/lib/kubelet/pods/d90a3502-774b-42a3-9d44-d0c19127b044/volumes" Apr 20 20:20:56.894531 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:56.894452 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:20:56.896810 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:20:56.896792 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:25:56.912613 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:25:56.912584 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:25:56.914597 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:25:56.914575 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log" Apr 20 20:28:28.431632 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.431547 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxgfw/must-gather-wddl9"] Apr 20 20:28:28.432059 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.431871 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" Apr 20 20:28:28.432059 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.431886 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" Apr 20 20:28:28.432059 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.431926 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d90a3502-774b-42a3-9d44-d0c19127b044" containerName="model-chainer-raw-hpa-3bab1" Apr 20 20:28:28.434797 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.434778 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kxgfw/must-gather-wddl9" Apr 20 20:28:28.437215 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.437180 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kxgfw\"/\"default-dockercfg-f8tg6\"" Apr 20 20:28:28.437387 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.437217 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxgfw\"/\"openshift-service-ca.crt\"" Apr 20 20:28:28.437988 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.437973 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxgfw\"/\"kube-root-ca.crt\"" Apr 20 20:28:28.446450 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.446415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/must-gather-wddl9"] Apr 20 20:28:28.576863 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.576817 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/698f369e-5ccd-4915-b909-8f26dcabb1e2-must-gather-output\") pod \"must-gather-wddl9\" (UID: \"698f369e-5ccd-4915-b909-8f26dcabb1e2\") " pod="openshift-must-gather-kxgfw/must-gather-wddl9" Apr 20 20:28:28.576863 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.576868 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmqtg\" (UniqueName: \"kubernetes.io/projected/698f369e-5ccd-4915-b909-8f26dcabb1e2-kube-api-access-gmqtg\") pod \"must-gather-wddl9\" (UID: \"698f369e-5ccd-4915-b909-8f26dcabb1e2\") " pod="openshift-must-gather-kxgfw/must-gather-wddl9" Apr 20 20:28:28.677716 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.677673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/698f369e-5ccd-4915-b909-8f26dcabb1e2-must-gather-output\") pod \"must-gather-wddl9\" (UID: \"698f369e-5ccd-4915-b909-8f26dcabb1e2\") " pod="openshift-must-gather-kxgfw/must-gather-wddl9" Apr 20 20:28:28.677716 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.677721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmqtg\" (UniqueName: \"kubernetes.io/projected/698f369e-5ccd-4915-b909-8f26dcabb1e2-kube-api-access-gmqtg\") pod \"must-gather-wddl9\" (UID: \"698f369e-5ccd-4915-b909-8f26dcabb1e2\") " pod="openshift-must-gather-kxgfw/must-gather-wddl9" Apr 20 20:28:28.678107 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.678075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/698f369e-5ccd-4915-b909-8f26dcabb1e2-must-gather-output\") pod \"must-gather-wddl9\" (UID: \"698f369e-5ccd-4915-b909-8f26dcabb1e2\") " pod="openshift-must-gather-kxgfw/must-gather-wddl9" Apr 20 20:28:28.686157 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.686080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmqtg\" (UniqueName: \"kubernetes.io/projected/698f369e-5ccd-4915-b909-8f26dcabb1e2-kube-api-access-gmqtg\") pod \"must-gather-wddl9\" (UID: \"698f369e-5ccd-4915-b909-8f26dcabb1e2\") " pod="openshift-must-gather-kxgfw/must-gather-wddl9" Apr 20 20:28:28.743387 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.743353 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kxgfw/must-gather-wddl9"
Apr 20 20:28:28.872150 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.872117 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/must-gather-wddl9"]
Apr 20 20:28:28.875326 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:28:28.875278 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod698f369e_5ccd_4915_b909_8f26dcabb1e2.slice/crio-d233340e6f31bbdd7c1180085fc1af515e85bf270b6fbada6a853979fddb776e WatchSource:0}: Error finding container d233340e6f31bbdd7c1180085fc1af515e85bf270b6fbada6a853979fddb776e: Status 404 returned error can't find the container with id d233340e6f31bbdd7c1180085fc1af515e85bf270b6fbada6a853979fddb776e
Apr 20 20:28:28.877166 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:28.877151 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:28:29.648434 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:29.648398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/must-gather-wddl9" event={"ID":"698f369e-5ccd-4915-b909-8f26dcabb1e2","Type":"ContainerStarted","Data":"d233340e6f31bbdd7c1180085fc1af515e85bf270b6fbada6a853979fddb776e"}
Apr 20 20:28:30.658945 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:30.658908 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/must-gather-wddl9" event={"ID":"698f369e-5ccd-4915-b909-8f26dcabb1e2","Type":"ContainerStarted","Data":"f828613f18eededc25073c9e6253682678012fa7509a1304bc971cc94e5b1aa7"}
Apr 20 20:28:30.658945 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:30.658948 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/must-gather-wddl9" event={"ID":"698f369e-5ccd-4915-b909-8f26dcabb1e2","Type":"ContainerStarted","Data":"e8bb2a402da97c518b605db918cf13c87b6de40fbde86cefd2f4c88f1bafedd7"}
Apr 20 20:28:30.673410 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:30.673340 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxgfw/must-gather-wddl9" podStartSLOduration=1.842966815 podStartE2EDuration="2.673317131s" podCreationTimestamp="2026-04-20 20:28:28 +0000 UTC" firstStartedPulling="2026-04-20 20:28:28.877278562 +0000 UTC m=+1352.427145155" lastFinishedPulling="2026-04-20 20:28:29.707628877 +0000 UTC m=+1353.257495471" observedRunningTime="2026-04-20 20:28:30.672649317 +0000 UTC m=+1354.222515931" watchObservedRunningTime="2026-04-20 20:28:30.673317131 +0000 UTC m=+1354.223183744"
Apr 20 20:28:31.195413 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:31.195376 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wvq2j_dd310fd8-ff36-47d6-9dbf-d8d029c30747/global-pull-secret-syncer/0.log"
Apr 20 20:28:31.251539 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:31.251508 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9rvtp_309654ff-b783-421b-91bd-5ae144783aa3/konnectivity-agent/0.log"
Apr 20 20:28:31.346134 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:31.346106 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-158.ec2.internal_9b4440db6557536c217fdb95da13736d/haproxy/0.log"
Apr 20 20:28:34.989551 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:34.989503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmx9j_e818016d-1b0a-49e0-8d0c-80e383b686e8/node-exporter/0.log"
Apr 20 20:28:35.014223 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:35.014190 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmx9j_e818016d-1b0a-49e0-8d0c-80e383b686e8/kube-rbac-proxy/0.log"
Apr 20 20:28:35.039075 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:35.039043 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmx9j_e818016d-1b0a-49e0-8d0c-80e383b686e8/init-textfile/0.log"
Apr 20 20:28:38.906982 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:38.906949 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7ssck_820f3779-0686-4cab-81ea-d64fa84a9bde/dns/0.log"
Apr 20 20:28:38.939085 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:38.939047 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"]
Apr 20 20:28:38.940548 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:38.940521 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7ssck_820f3779-0686-4cab-81ea-d64fa84a9bde/kube-rbac-proxy/0.log"
Apr 20 20:28:38.943231 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:38.943207 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:38.952105 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:38.952080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"]
Apr 20 20:28:39.023253 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.023162 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4qgql_6f6d52d5-73cf-459b-a235-e5cfe1d91c81/dns-node-resolver/0.log"
Apr 20 20:28:39.064694 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.064654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-lib-modules\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.064694 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.064691 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-podres\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.064969 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.064804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwjl5\" (UniqueName: \"kubernetes.io/projected/a779e4b1-7696-4c60-a61d-284a14ca157f-kube-api-access-hwjl5\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.064969 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.064838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-sys\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.064969 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.064859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-proc\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165606 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-lib-modules\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165606 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-podres\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165834 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwjl5\" (UniqueName: \"kubernetes.io/projected/a779e4b1-7696-4c60-a61d-284a14ca157f-kube-api-access-hwjl5\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165834 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-sys\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165834 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-proc\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165834 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-lib-modules\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165834 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165727 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-podres\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165834 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165728 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-sys\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.165834 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.165752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a779e4b1-7696-4c60-a61d-284a14ca157f-proc\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.173608 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.173574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwjl5\" (UniqueName: \"kubernetes.io/projected/a779e4b1-7696-4c60-a61d-284a14ca157f-kube-api-access-hwjl5\") pod \"perf-node-gather-daemonset-qv84n\" (UID: \"a779e4b1-7696-4c60-a61d-284a14ca157f\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.255343 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.255304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.403320 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.403268 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"]
Apr 20 20:28:39.407056 ip-10-0-136-158 kubenswrapper[2573]: W0420 20:28:39.407022 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda779e4b1_7696_4c60_a61d_284a14ca157f.slice/crio-6bc212f2794dc983a305fac5cdd8446c0be90cc411010e7101d1d7ef5553c677 WatchSource:0}: Error finding container 6bc212f2794dc983a305fac5cdd8446c0be90cc411010e7101d1d7ef5553c677: Status 404 returned error can't find the container with id 6bc212f2794dc983a305fac5cdd8446c0be90cc411010e7101d1d7ef5553c677
Apr 20 20:28:39.513968 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.513937 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dt2xp_aa0188c0-4215-47d0-a910-5a4c74cbc7cc/node-ca/0.log"
Apr 20 20:28:39.694755 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.694659 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n" event={"ID":"a779e4b1-7696-4c60-a61d-284a14ca157f","Type":"ContainerStarted","Data":"eac7994f173433c9266f863a8f96475264bffa487f4a52f6aece6a7427ac0431"}
Apr 20 20:28:39.694755 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.694707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n" event={"ID":"a779e4b1-7696-4c60-a61d-284a14ca157f","Type":"ContainerStarted","Data":"6bc212f2794dc983a305fac5cdd8446c0be90cc411010e7101d1d7ef5553c677"}
Apr 20 20:28:39.694982 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.694823 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:39.710502 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:39.710440 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n" podStartSLOduration=1.710422949 podStartE2EDuration="1.710422949s" podCreationTimestamp="2026-04-20 20:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:28:39.709424403 +0000 UTC m=+1363.259291020" watchObservedRunningTime="2026-04-20 20:28:39.710422949 +0000 UTC m=+1363.260289563"
Apr 20 20:28:40.531976 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:40.531936 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-566sx_a5f9fd3a-20c3-49e2-860d-0b343b78d891/serve-healthcheck-canary/0.log"
Apr 20 20:28:40.926525 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:40.926499 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-58cpq_2ed5cae5-de16-4920-ba49-c4347715ab85/kube-rbac-proxy/0.log"
Apr 20 20:28:40.950909 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:40.950877 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-58cpq_2ed5cae5-de16-4920-ba49-c4347715ab85/exporter/0.log"
Apr 20 20:28:40.972745 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:40.972683 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-58cpq_2ed5cae5-de16-4920-ba49-c4347715ab85/extractor/0.log"
Apr 20 20:28:43.185073 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:43.185040 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-r898s_9e939f77-2e54-4923-9373-24d56cc11538/manager/0.log"
Apr 20 20:28:45.707822 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:45.707793 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-qv84n"
Apr 20 20:28:48.149682 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.149652 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5r4lb_f3332e18-c5c0-47a4-a5ed-4b719b4bc831/kube-multus/0.log"
Apr 20 20:28:48.363026 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.362998 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9zzt_99945c0f-09c6-48fb-84b1-4299b5936bd6/kube-multus-additional-cni-plugins/0.log"
Apr 20 20:28:48.383575 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.383548 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9zzt_99945c0f-09c6-48fb-84b1-4299b5936bd6/egress-router-binary-copy/0.log"
Apr 20 20:28:48.404931 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.404849 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9zzt_99945c0f-09c6-48fb-84b1-4299b5936bd6/cni-plugins/0.log"
Apr 20 20:28:48.427445 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.427415 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9zzt_99945c0f-09c6-48fb-84b1-4299b5936bd6/bond-cni-plugin/0.log"
Apr 20 20:28:48.450990 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.450957 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9zzt_99945c0f-09c6-48fb-84b1-4299b5936bd6/routeoverride-cni/0.log"
Apr 20 20:28:48.471808 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.471770 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9zzt_99945c0f-09c6-48fb-84b1-4299b5936bd6/whereabouts-cni-bincopy/0.log"
Apr 20 20:28:48.491787 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.491757 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9zzt_99945c0f-09c6-48fb-84b1-4299b5936bd6/whereabouts-cni/0.log"
Apr 20 20:28:48.735003 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.734910 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d2g4l_9f213e16-074a-493b-b57c-f84483b57308/network-metrics-daemon/0.log"
Apr 20 20:28:48.756595 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:48.756566 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d2g4l_9f213e16-074a-493b-b57c-f84483b57308/kube-rbac-proxy/0.log"
Apr 20 20:28:49.752437 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.752405 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-controller/0.log"
Apr 20 20:28:49.768646 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.768615 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/0.log"
Apr 20 20:28:49.781452 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.781417 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovn-acl-logging/1.log"
Apr 20 20:28:49.807452 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.807419 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/kube-rbac-proxy-node/0.log"
Apr 20 20:28:49.829949 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.829919 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 20:28:49.847371 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.847339 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/northd/0.log"
Apr 20 20:28:49.868496 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.868468 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/nbdb/0.log"
Apr 20 20:28:49.888756 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:49.888720 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/sbdb/0.log"
Apr 20 20:28:50.077468 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:50.077373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-664zl_c5c20964-6b44-4902-91b4-e2f99aceca2f/ovnkube-controller/0.log"
Apr 20 20:28:51.331542 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:51.331517 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-t9khm_5409aec2-613d-49b4-aad6-5dda25f70168/network-check-target-container/0.log"
Apr 20 20:28:52.183779 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:52.183743 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-m557q_01f7cf40-a02f-4e4d-846f-75cdf011fbb1/iptables-alerter/0.log"
Apr 20 20:28:52.932436 ip-10-0-136-158 kubenswrapper[2573]: I0420 20:28:52.932403 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zr8nd_7f785e5a-c4aa-40db-a581-8d086c1bf8cb/tuned/0.log"