Apr 24 21:26:37.111419 ip-10-0-133-209 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:37.531344 ip-10-0-133-209 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:37.531344 ip-10-0-133-209 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:37.531344 ip-10-0-133-209 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:37.531344 ip-10-0-133-209 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:37.531344 ip-10-0-133-209 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:37.532800 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.532710    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:37.539372 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539348    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:37.539372 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539367    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:37.539372 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539371    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:37.539372 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539374    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:37.539372 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539378    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539382    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539386    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539389    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539393    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539396    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539399    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539401    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539404    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539407    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539410    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539413    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539415    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539418    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539421    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539424    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539426    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539429    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539431    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539442    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:37.539589 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539445    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539447    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539450    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539453    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539456    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539458    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539461    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539464    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539466    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539469    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539472    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539476    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539479    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539482    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539486    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539490    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539492    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539496    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539498    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539501    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:37.540059 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539504    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539508    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539511    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539515    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539518    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539521    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539523    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539526    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539529    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539531    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539534    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539537    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539539    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539542    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539545    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539547    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539550    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539554    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539558    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:37.540616 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539562    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539564    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539567    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539569    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539572    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539574    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539577    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539580    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539583    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539586    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539588    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539591    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539593    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539596    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539599    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539602    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539605    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539608    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539610    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539613    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:37.541110 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539616    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539618    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.539621    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540046    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540052    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540055    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540059    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540062    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540065    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540068    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540071    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540074    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540077    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540080    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540082    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540085    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540088    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540090    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540093    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540111    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:37.541601 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540114    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540117    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540120    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540122    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540125    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540128    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540130    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540134    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540137    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540139    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540142    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540145    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540148    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540150    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540153    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540156    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540158    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540161    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540164    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540166    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:37.542135 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540169    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540172    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540174    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540177    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540180    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540182    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540186    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540188    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540191    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540193    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540196    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540200    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540203    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540206    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540208    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540211    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540213    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540216    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540219    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540221    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:37.542686 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540224    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540227    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540229    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540232    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540235    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540237    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540240    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540242    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540245    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540248    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540250    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540253    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540255    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540258    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540260    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540263    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540267    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540270    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540274    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:37.543188 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540278    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540282    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540285    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540288    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540291    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540295    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540297    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540300    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540302    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.540305    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540380    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540388    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540394    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540398    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540404    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540408    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540412    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540418    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540421    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540424    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540428    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:37.543656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540432    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540435    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540438    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540440    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540443    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540446    2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540450    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540453    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540457    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540460    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540463    2574 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540467    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540470    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540478    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540482    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540485    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540489    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540492    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540495    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540498    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540501    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540504    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540508    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540511    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540515    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:37.544198 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540518    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540521    2574 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540524    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540529    2574 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540532    2574 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540535    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540538    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540541    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540545    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540548    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540551    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540554    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540557    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540560    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540563    2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540566    2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540569    2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540572    2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540575    2574 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540579    2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540582    2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540586    2574 flags.go:64] FLAG:
--hairpin-mode="promiscuous-bridge" Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540589 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540592 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540595 2574 flags.go:64] FLAG: --help="false" Apr 24 21:26:37.544851 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540598 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540601 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540604 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540607 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540611 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540615 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540617 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540620 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540624 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540627 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:26:37.545489 ip-10-0-133-209 
kubenswrapper[2574]: I0424 21:26:37.540631 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540634 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540637 2574 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540640 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540643 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540646 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540649 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540651 2574 flags.go:64] FLAG: --lock-file="" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540654 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540657 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540660 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540666 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540669 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540672 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:26:37.545489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540675 2574 flags.go:64] FLAG: 
--logging-format="text" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540677 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540681 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540684 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540686 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540691 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540694 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540698 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540701 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540704 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540707 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540710 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540713 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540716 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540719 2574 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540729 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540732 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540735 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540739 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540742 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540748 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540751 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540754 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540757 2574 flags.go:64] FLAG: --port="10250" Apr 24 21:26:37.546082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540760 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540763 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06b5f3b0f31ca0f2f" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540766 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540772 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540777 
2574 flags.go:64] FLAG: --register-node="true" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540780 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540783 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540787 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540790 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540793 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540796 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540799 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540803 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540806 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540808 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540811 2574 flags.go:64] FLAG: --runonce="false" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540815 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540817 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540820 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 
21:26:37.540823 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540826 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540830 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540833 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540836 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540839 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540842 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:26:37.546679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540845 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540854 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540857 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540860 2574 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540863 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540868 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540871 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540874 2574 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540878 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540882 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540887 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540890 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540893 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540896 2574 flags.go:64] FLAG: --v="2" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540905 2574 flags.go:64] FLAG: --version="false" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540909 2574 flags.go:64] FLAG: --vmodule="" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540914 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.540917 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541008 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541011 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541014 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541017 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 
24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541021 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:26:37.547355 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541025 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541028 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541031 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541034 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541036 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541039 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541042 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541045 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541048 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541050 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541053 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541056 
2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541059 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541061 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541064 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541066 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541069 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541071 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541075 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541079 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:26:37.547906 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541082 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541085 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541087 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541090 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 
21:26:37.541092 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541108 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541111 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541114 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541117 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541120 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541123 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541126 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541128 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541131 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541134 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541136 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541139 2574 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541141 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541144 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541147 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:26:37.548464 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541150 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541152 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541155 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541158 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541161 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541164 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541167 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541169 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541172 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:26:37.548964 ip-10-0-133-209 
kubenswrapper[2574]: W0424 21:26:37.541174 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541178 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541183 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541185 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541188 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541190 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541193 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541196 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541198 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541201 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541204 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:37.548964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541206 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541209 2574 feature_gate.go:328] unrecognized feature 
gate: MachineAPIMigration Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541211 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541214 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541217 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541219 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541224 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541227 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541230 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541233 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541236 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541239 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541242 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541244 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:26:37.549482 ip-10-0-133-209 
kubenswrapper[2574]: W0424 21:26:37.541247 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541249 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541252 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541255 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541258 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:26:37.549482 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541260 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.541262 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.541924 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.548381 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.548398 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" 
GOTRACEBACK="" Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548448 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548453 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548457 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548460 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548463 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548466 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548470 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548474    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548477    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548480    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:37.549964 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548483    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548486    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548489    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548492    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548494    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548497    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548500    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548503    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548505    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548508    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548510    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548513    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548518    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548522    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548525    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548535    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548538    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548541    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548544    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548547    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:37.550377 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548549    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548553    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548557    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548560    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548562    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548565    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548568    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548571    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548573    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548576    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548579    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548581    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548584    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548586    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548589    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548592    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548595    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548597    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548600    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:37.550883 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548602    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548605    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548607    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548610    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548613    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548615    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548618    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548620    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548623    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548626    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548629    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548632    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548634    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548636    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548640    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548643    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548646    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548649    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548652    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548654    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:37.551359 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548657    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548659    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548662    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548664    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548667    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548669    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548672    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548674    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548677    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548679    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548682    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548684    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548687    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548689    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548692    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548694    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:37.551885 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548697    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.548702    2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548802    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548806    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548809    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548812    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548815    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548818    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548821    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548823    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548826    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548830    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548834    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548839    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548843    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:37.552306 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548846    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548849    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548852    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548855    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548858    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548861    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548864    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548867    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548870    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548873    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548876    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548879    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548882    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548884    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548887    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548889    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548892    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548894    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548897    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548899    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:37.552845 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548902    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548904    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548907    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548909    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548913    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548915    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548918    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548921    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548924    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548927    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548930    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548932    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548935    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548938    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548940    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548943    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548945    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548948    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548950    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548953    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:37.553393 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548955    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548958    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548961    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548963    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548966    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548969    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548972    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548974    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548977    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548980    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548982    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548985    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548988    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548991    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548993    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548996    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.548998    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549001    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549004    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549007    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:37.553879 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549010    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549013    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549015    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549017    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549020    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549022    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549025    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549028    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549030    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549033    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549035    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549038    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:37.549040    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.549045    2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.549815    2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:26:37.554462 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.552326    2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:26:37.554831 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.553199    2574 server.go:1019] "Starting client certificate rotation"
Apr 24 21:26:37.554831 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.553295    2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:37.554831 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.553331    2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:37.576818 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.576677    2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:37.579324 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.579299    2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:37.599026 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.599005    2574 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:26:37.603508 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.603486    2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:37.604159 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.604136    2574 log.go:25] "Validated CRI v1 image API"
Apr 24 21:26:37.605255 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.605235    2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:26:37.607430 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.607409    2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c77f0d3b-27b5-4923-b671-d87d3e7f5b1a:/dev/nvme0n1p4 cd240714-de01-41df-bd7d-25d93026007f:/dev/nvme0n1p3]
Apr 24 21:26:37.607485 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.607430    2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:37.613036 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.612928    2574 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:37.611108144 +0000 UTC m=+0.383645242 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103037 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec213b1c3296922d10425d22659b02aa SystemUUID:ec213b1c-3296-922d-1042-5d22659b02aa BootID:4212022f-8956-4455-b72b-4a92cc03f8e3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ad:80:bc:54:9b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ad:80:bc:54:9b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:4e:0e:34:05:a2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:37.613036 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.613031    2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:37.613147 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.613125    2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:37.615127 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.615090    2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:37.615270 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.615129    2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-209.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:26:37.615311 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.615279 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:26:37.615311 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.615289 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:26:37.615311 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.615302 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:37.616214 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.616204 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:37.617355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.617345 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:37.617476 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.617467 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:26:37.620172 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.620161 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:26:37.620210 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.620181 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:26:37.620210 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.620192 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:26:37.620210 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.620202 2574 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:26:37.620343 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.620211 2574 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 21:26:37.621295 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.621281 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:37.621295 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.621299 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:37.624280 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.624266 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:26:37.625543 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.625527 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:26:37.627052 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627037 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:26:37.627052 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627053 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:26:37.627052 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627060 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:26:37.627052 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627065 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627070 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627076 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627083 2574 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627088 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627123 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627130 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627145 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:26:37.627286 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.627154 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:26:37.628130 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.628118 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:26:37.628170 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.628135 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:26:37.629741 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.629721 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-209.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:26:37.630363 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.630342 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j8sks" Apr 24 21:26:37.630363 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.630347 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" 
cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:26:37.631958 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.631941 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:26:37.632032 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.631976 2574 server.go:1295] "Started kubelet" Apr 24 21:26:37.632032 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.632012 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-209.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:26:37.632151 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.632078 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:26:37.632151 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.632069 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:26:37.632236 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.632164 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:26:37.632878 ip-10-0-133-209 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:26:37.633274 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.633209 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:26:37.637772 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.637747 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-j8sks" Apr 24 21:26:37.638240 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.638155 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:26:37.641425 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.641398 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:26:37.643247 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.643221 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:37.643661 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.643641 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:26:37.644351 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644332 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:26:37.644351 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644335 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:26:37.644505 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644366 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:26:37.644505 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644417 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:26:37.644505 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644428 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:26:37.644635 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.644488 2574 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found" Apr 24 21:26:37.644686 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644671 2574 factory.go:55] Registering systemd factory Apr 24 21:26:37.644723 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644691 2574 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:26:37.644935 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644924 2574 factory.go:153] Registering CRI-O factory Apr 24 21:26:37.644978 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644938 2574 factory.go:223] Registration of the crio container factory successfully Apr 24 21:26:37.645017 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.644992 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:26:37.645048 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.645031 2574 factory.go:103] Registering Raw factory Apr 24 21:26:37.645048 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.645043 2574 manager.go:1196] Started watching for new ooms in manager Apr 24 21:26:37.645427 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.645412 2574 manager.go:319] Starting recovery of all containers Apr 24 21:26:37.646258 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.646235 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:37.652245 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.652222 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-209.ec2.internal\" not found" node="ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.653778 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.653763 2574 
manager.go:324] Recovery completed Apr 24 21:26:37.655083 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.655058 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 21:26:37.657993 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.657978 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:37.661039 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.661025 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:37.661088 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.661054 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:37.661088 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.661063 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:37.661564 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.661549 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:26:37.661613 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.661565 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:26:37.661613 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.661584 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:37.664696 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.664682 2574 policy_none.go:49] "None policy: Start" Apr 24 21:26:37.664763 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.664699 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:26:37.664763 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.664709 2574 state_mem.go:35] "Initializing 
new in-memory state store" Apr 24 21:26:37.704750 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.704732 2574 manager.go:341] "Starting Device Plugin manager" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.704766 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.704777 2574 server.go:85] "Starting device plugin registration server" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.705018 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.705032 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.705128 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.705197 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.705204 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.705935 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:26:37.728489 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.705975 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-209.ec2.internal\" not found" Apr 24 21:26:37.777403 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.777370 2574 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 24 21:26:37.778655 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.778628 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:26:37.778761 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.778661 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:26:37.778761 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.778684 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:26:37.778761 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.778693 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:37.778761 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.778733 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:37.782947 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.782897 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:37.805984 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.805960 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:37.807019 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.807001 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:37.807114 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.807037 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:37.807114 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.807050 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:37.807114 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.807072 2574 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.813818 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.813795 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.813903 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.813821 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-209.ec2.internal\": node \"ip-10-0-133-209.ec2.internal\" not found" Apr 24 21:26:37.833247 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.833228 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found" Apr 24 21:26:37.879060 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.879029 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal"] Apr 24 21:26:37.879164 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.879141 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:37.881063 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.881049 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:37.881132 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.881078 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:37.881132 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.881089 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:37.883310 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.883297 2574 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 24 21:26:37.883494 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.883479 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.883536 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.883508 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:37.884061 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.884042 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:37.884156 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.884064 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:37.884156 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.884075 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:37.884156 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.884083 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:37.884156 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.884091 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:37.884156 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.884114 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:37.886338 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.886321 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.886381 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.886356 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:37.887110 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.887076 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:37.887182 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.887126 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:37.887182 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:37.887136 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:37.897659 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.897637 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-209.ec2.internal\" not found" node="ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.900971 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.900955 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-209.ec2.internal\" not found" node="ip-10-0-133-209.ec2.internal" Apr 24 21:26:37.934160 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:37.934130 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found" Apr 24 21:26:38.034390 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.034315 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found" Apr 24 21:26:38.045701 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.045677 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef10e2637f5c12380b7bcf89d82ff9d2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal\" (UID: \"ef10e2637f5c12380b7bcf89d82ff9d2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" Apr 24 21:26:38.045795 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.045706 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef10e2637f5c12380b7bcf89d82ff9d2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal\" (UID: \"ef10e2637f5c12380b7bcf89d82ff9d2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" Apr 24 21:26:38.045795 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.045738 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fa9ef8c20bc608bfc7fe350ef3c4a29b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-209.ec2.internal\" (UID: \"fa9ef8c20bc608bfc7fe350ef3c4a29b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal" Apr 24 21:26:38.135224 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.135182 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found" Apr 24 21:26:38.146553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.146534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef10e2637f5c12380b7bcf89d82ff9d2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal\" (UID: \"ef10e2637f5c12380b7bcf89d82ff9d2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" Apr 24 
21:26:38.146633 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.146562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef10e2637f5c12380b7bcf89d82ff9d2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal\" (UID: \"ef10e2637f5c12380b7bcf89d82ff9d2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" Apr 24 21:26:38.146633 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.146580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fa9ef8c20bc608bfc7fe350ef3c4a29b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-209.ec2.internal\" (UID: \"fa9ef8c20bc608bfc7fe350ef3c4a29b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal" Apr 24 21:26:38.146633 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.146622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fa9ef8c20bc608bfc7fe350ef3c4a29b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-209.ec2.internal\" (UID: \"fa9ef8c20bc608bfc7fe350ef3c4a29b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal" Apr 24 21:26:38.146750 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.146645 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef10e2637f5c12380b7bcf89d82ff9d2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal\" (UID: \"ef10e2637f5c12380b7bcf89d82ff9d2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" Apr 24 21:26:38.146750 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.146648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/ef10e2637f5c12380b7bcf89d82ff9d2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal\" (UID: \"ef10e2637f5c12380b7bcf89d82ff9d2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal"
Apr 24 21:26:38.199669 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.199640 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal"
Apr 24 21:26:38.203181 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.203160 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal"
Apr 24 21:26:38.235655 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.235625 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:38.336169 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.336071 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:38.436643 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.436608 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:38.537216 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.537170 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:38.553701 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.553676 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:26:38.553832 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.553818 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:26:38.553873 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.553851 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:26:38.637864 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.637831 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:38.640538 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.640507 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:37 +0000 UTC" deadline="2027-10-26 10:49:46.203274323 +0000 UTC"
Apr 24 21:26:38.640538 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.640531 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13189h23m7.562746547s"
Apr 24 21:26:38.644047 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.644027 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:26:38.656260 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.656230 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:38.674004 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.673981 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lg2q8"
Apr 24 21:26:38.681840 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.681820 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lg2q8"
Apr 24 21:26:38.726536 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:38.726497 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef10e2637f5c12380b7bcf89d82ff9d2.slice/crio-0c086eb64f9c8c5ed726504b8b0ed7dafcc9bb709ed28a7f350495ac7234dad1 WatchSource:0}: Error finding container 0c086eb64f9c8c5ed726504b8b0ed7dafcc9bb709ed28a7f350495ac7234dad1: Status 404 returned error can't find the container with id 0c086eb64f9c8c5ed726504b8b0ed7dafcc9bb709ed28a7f350495ac7234dad1
Apr 24 21:26:38.734007 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.733988 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:26:38.738724 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.738700 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:38.740016 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:38.739991 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9ef8c20bc608bfc7fe350ef3c4a29b.slice/crio-80907961e93bebe35c611a3bb904be633af8921053c50c8a388438c92e99f13a WatchSource:0}: Error finding container 80907961e93bebe35c611a3bb904be633af8921053c50c8a388438c92e99f13a: Status 404 returned error can't find the container with id 80907961e93bebe35c611a3bb904be633af8921053c50c8a388438c92e99f13a
Apr 24 21:26:38.781883 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.781837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal" event={"ID":"fa9ef8c20bc608bfc7fe350ef3c4a29b","Type":"ContainerStarted","Data":"80907961e93bebe35c611a3bb904be633af8921053c50c8a388438c92e99f13a"}
Apr 24 21:26:38.782942 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.782913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" event={"ID":"ef10e2637f5c12380b7bcf89d82ff9d2","Type":"ContainerStarted","Data":"0c086eb64f9c8c5ed726504b8b0ed7dafcc9bb709ed28a7f350495ac7234dad1"}
Apr 24 21:26:38.814378 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:38.814354 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:38.839779 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.839745 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:38.940312 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:38.940229 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:39.040740 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.040703 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-209.ec2.internal\" not found"
Apr 24 21:26:39.131387 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.131358 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:39.144062 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.144034 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal"
Apr 24 21:26:39.156355 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.156212 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:39.157418 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.157169 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal"
Apr 24 21:26:39.163277 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.163257 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:39.563115 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.561716 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:39.621090 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.621048 2574 apiserver.go:52] "Watching apiserver"
Apr 24 21:26:39.628674 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.628652 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:26:39.629767 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.629747 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw","openshift-cluster-node-tuning-operator/tuned-h4s4s","openshift-network-diagnostics/network-check-target-mzzxk","openshift-network-operator/iptables-alerter-wxlkl","openshift-dns/node-resolver-6jctc","openshift-image-registry/node-ca-sfdpv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal","openshift-multus/multus-additional-cni-plugins-hg8hv","openshift-multus/multus-dqnl7","openshift-multus/network-metrics-daemon-c4ck8","openshift-ovn-kubernetes/ovnkube-node-9jw7k","kube-system/konnectivity-agent-tf6b4","kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal"]
Apr 24 21:26:39.632801 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.632782 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:39.632899 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.632878 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e"
Apr 24 21:26:39.637014 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.636953 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.639283 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.639035 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:26:39.639283 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.639115 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705"
Apr 24 21:26:39.639283 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.639156 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24 21:26:39.639457 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.639345 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:39.639457 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.639420 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2qtt6\""
Apr 24 21:26:39.639457 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.639396 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:39.641242 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.641224 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:26:39.641347 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.641295 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6jctc"
Apr 24 21:26:39.641707 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.641685 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-752gp\""
Apr 24 21:26:39.641829 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.641733 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:39.641829 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.641740 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:39.643485 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.643445 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sfdpv"
Apr 24 21:26:39.643592 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.643551 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ntglr\""
Apr 24 21:26:39.643742 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.643722 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:26:39.643825 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.643773 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:26:39.645503 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.645483 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:26:39.645834 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.645758 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:26:39.645917 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.645868 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.646015 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.645999 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:26:39.646405 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.646389 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mmf58\""
Apr 24 21:26:39.648068 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.648047 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.648521 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.648501 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:26:39.648599 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.648519 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:26:39.648654 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.648601 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:26:39.648654 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.648616 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:26:39.648738 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.648721 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:26:39.648783 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.648722 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jztmk\""
Apr 24 21:26:39.650561 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.650411 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.653792 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.651736 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:26:39.653792 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.652083 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7gtpj\""
Apr 24 21:26:39.653792 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.653148 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:26:39.653792 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.653274 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jn2n\""
Apr 24 21:26:39.653792 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.653662 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:26:39.654287 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654203 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.654287 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-hosts-file\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc"
Apr 24 21:26:39.654287 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-tmp-dir\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc"
Apr 24 21:26:39.654433 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/572a1510-69a8-48bf-94b2-311fd0c0d92f-host\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv"
Apr 24 21:26:39.654433 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654334 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysctl-conf\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654433 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654363 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysctl-d\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654588 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-systemd\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654588 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654539 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-run\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654588 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhg48\" (UniqueName: \"kubernetes.io/projected/c3c7543c-7f18-44b0-b64e-519be8319862-kube-api-access-jhg48\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654729 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/967d645a-e8ea-4968-bac5-2446d61f1581-iptables-alerter-script\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24 21:26:39.654729 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654647 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.654729 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8gp\" (UniqueName: \"kubernetes.io/projected/572a1510-69a8-48bf-94b2-311fd0c0d92f-kube-api-access-9g8gp\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv"
Apr 24 21:26:39.654729 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654693 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:26:39.654729 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-modprobe-d\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654961 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654738 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-sys\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654961 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3c7543c-7f18-44b0-b64e-519be8319862-tmp\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.654961 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/967d645a-e8ea-4968-bac5-2446d61f1581-host-slash\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24 21:26:39.654961 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.654961 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:39.655221 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.654987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysconfig\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.655221 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655013 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-host\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.655221 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.655364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655225 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/572a1510-69a8-48bf-94b2-311fd0c0d92f-serviceca\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv"
Apr 24 21:26:39.655364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56lfb\" (UniqueName: \"kubernetes.io/projected/52e9a5f8-832a-4f1e-add3-f10bf674757e-kube-api-access-56lfb\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:39.655364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nw4t\" (UniqueName: \"kubernetes.io/projected/967d645a-e8ea-4968-bac5-2446d61f1581-kube-api-access-6nw4t\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24 21:26:39.655364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7n47\" (UniqueName: \"kubernetes.io/projected/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-kube-api-access-b7n47\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc"
Apr 24 21:26:39.655553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-kubernetes\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.655553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655422 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-var-lib-kubelet\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.655553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-system-cni-dir\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.655553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-os-release\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.655553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.655786 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655606 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:26:39.655786 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-lib-modules\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.655786 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c3c7543c-7f18-44b0-b64e-519be8319862-etc-tuned\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.655786 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-cnibin\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.655786 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.655718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xf4r\" (UniqueName: \"kubernetes.io/projected/425e4fdf-8950-4297-b9c8-488b3e610f40-kube-api-access-2xf4r\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.656542 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.656520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gv74c\""
Apr 24 21:26:39.656682 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.656662 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:26:39.656866 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.656803 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:26:39.656866 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.656847 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:26:39.656996 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.656899 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:26:39.657489 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.657471 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:26:39.657866 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.657847 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:26:39.657866 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.657860 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:26:39.659707 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.659687 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:26:39.659784 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.659700 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9cvct\""
Apr 24 21:26:39.659784 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.659721 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:26:39.682911 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.682878 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:38 +0000 UTC" deadline="2027-11-02 02:34:50.594374639 +0000 UTC"
Apr 24 21:26:39.683012 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.682912 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13349h8m10.911467019s"
Apr 24 21:26:39.693554 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.693526 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:39.745544 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.745518 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:26:39.755958 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.755930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-registration-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.756110 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.755970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-etc-kubernetes\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.756110 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.755998 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktt5\" (UniqueName: \"kubernetes.io/projected/af21b504-2a52-42ab-82d6-71911cc6a655-kube-api-access-dktt5\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.756110 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-slash\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.756110 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovnkube-config\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.756110 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-systemd\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.756315 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhg48\" (UniqueName: \"kubernetes.io/projected/c3c7543c-7f18-44b0-b64e-519be8319862-kube-api-access-jhg48\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.756315 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756195 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-systemd\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.756315 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/967d645a-e8ea-4968-bac5-2446d61f1581-iptables-alerter-script\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24
21:26:39.756315 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.756315 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.756315 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.756542 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-etc-selinux\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.756542 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8gp\" (UniqueName: 
\"kubernetes.io/projected/572a1510-69a8-48bf-94b2-311fd0c0d92f-kube-api-access-9g8gp\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv" Apr 24 21:26:39.756542 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.756636 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-sys\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.756636 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv52r\" (UniqueName: \"kubernetes.io/projected/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-kube-api-access-hv52r\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.756636 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-system-cni-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.756765 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756654 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-cni-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.756765 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-sys\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.756853 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756802 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-systemd-units\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.756897 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-var-lib-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.756943 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756898 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-cni-bin\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.756943 ip-10-0-133-209 kubenswrapper[2574]: I0424 
21:26:39.756905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/967d645a-e8ea-4968-bac5-2446d61f1581-iptables-alerter-script\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl" Apr 24 21:26:39.757020 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.756958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysconfig\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.757071 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757025 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.757071 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757037 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysconfig\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.757071 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-sys-fs\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.757236 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757091 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-cnibin\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.757236 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-log-socket\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.757236 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757140 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.757236 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-cni-netd\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.757236 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nw4t\" (UniqueName: 
\"kubernetes.io/projected/967d645a-e8ea-4968-bac5-2446d61f1581-kube-api-access-6nw4t\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl" Apr 24 21:26:39.757454 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757245 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-device-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.757454 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-netns\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.757454 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovn-node-metrics-cert\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.757454 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7n47\" (UniqueName: \"kubernetes.io/projected/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-kube-api-access-b7n47\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc" Apr 24 21:26:39.757630 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757519 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.757630 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757588 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-os-release\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.757727 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757701 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-cni-bin\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.757811 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-run-netns\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.757927 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82g7c\" (UniqueName: \"kubernetes.io/projected/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-kube-api-access-82g7c\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.757927 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-kubernetes\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.757927 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-var-lib-kubelet\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758086 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-os-release\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.758086 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758026 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-kubelet\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.758086 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758054 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-kubelet\") pod 
\"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.758275 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-lib-modules\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758275 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-os-release\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.758275 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-cnibin\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.758275 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xf4r\" (UniqueName: \"kubernetes.io/projected/425e4fdf-8950-4297-b9c8-488b3e610f40-kube-api-access-2xf4r\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.758275 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758234 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-cni-multus\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758286 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-lib-modules\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-etc-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-env-overrides\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758378 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-hosts-file\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758430 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-var-lib-kubelet\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.757985 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-kubernetes\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-tmp-dir\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc" Apr 24 21:26:39.758487 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758448 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-hosts-file\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758499 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/572a1510-69a8-48bf-94b2-311fd0c0d92f-host\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysctl-conf\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/572a1510-69a8-48bf-94b2-311fd0c0d92f-host\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/01d20a9b-0255-4507-a8b5-862da4147c01-agent-certs\") pod \"konnectivity-agent-tf6b4\" (UID: \"01d20a9b-0255-4507-a8b5-862da4147c01\") " pod="kube-system/konnectivity-agent-tf6b4" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-cnibin\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysctl-conf\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-ovn\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-tmp-dir\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysctl-d\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758821 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-run\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.758878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-multus-certs\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-modprobe-d\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3c7543c-7f18-44b0-b64e-519be8319862-tmp\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-run\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-sysctl-d\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/967d645a-e8ea-4968-bac5-2446d61f1581-host-slash\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.758980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-host\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/967d645a-e8ea-4968-bac5-2446d61f1581-host-slash\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-etc-modprobe-d\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-conf-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovnkube-script-lib\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3c7543c-7f18-44b0-b64e-519be8319862-host\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.759138 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/572a1510-69a8-48bf-94b2-311fd0c0d92f-serviceca\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56lfb\" (UniqueName: \"kubernetes.io/projected/52e9a5f8-832a-4f1e-add3-f10bf674757e-kube-api-access-56lfb\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.759220 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:40.25918729 +0000 UTC m=+3.031724391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:39.759292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759245 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af21b504-2a52-42ab-82d6-71911cc6a655-cni-binary-copy\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-socket-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-socket-dir-parent\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-systemd\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-system-cni-dir\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759518 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/01d20a9b-0255-4507-a8b5-862da4147c01-konnectivity-ca\") pod \"konnectivity-agent-tf6b4\" (UID: \"01d20a9b-0255-4507-a8b5-862da4147c01\") " pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/425e4fdf-8950-4297-b9c8-488b3e610f40-system-cni-dir\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-k8s-cni-cncf-io\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/572a1510-69a8-48bf-94b2-311fd0c0d92f-serviceca\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-hostroot\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.759974 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-node-log\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.760087 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.760071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-run-ovn-kubernetes\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.760690 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.760146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:26:39.760690 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.760204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c3c7543c-7f18-44b0-b64e-519be8319862-etc-tuned\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.760690 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.760315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af21b504-2a52-42ab-82d6-71911cc6a655-multus-daemon-config\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.760690 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.760597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/425e4fdf-8950-4297-b9c8-488b3e610f40-cni-binary-copy\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.764290 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.764090 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c3c7543c-7f18-44b0-b64e-519be8319862-etc-tuned\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.764529 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.764477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c3c7543c-7f18-44b0-b64e-519be8319862-tmp\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.769964 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.769936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8gp\" (UniqueName: \"kubernetes.io/projected/572a1510-69a8-48bf-94b2-311fd0c0d92f-kube-api-access-9g8gp\") pod \"node-ca-sfdpv\" (UID: \"572a1510-69a8-48bf-94b2-311fd0c0d92f\") " pod="openshift-image-registry/node-ca-sfdpv"
Apr 24 21:26:39.772446 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.772422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xf4r\" (UniqueName: \"kubernetes.io/projected/425e4fdf-8950-4297-b9c8-488b3e610f40-kube-api-access-2xf4r\") pod \"multus-additional-cni-plugins-hg8hv\" (UID: \"425e4fdf-8950-4297-b9c8-488b3e610f40\") " pod="openshift-multus/multus-additional-cni-plugins-hg8hv"
Apr 24 21:26:39.773014 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.772990 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:39.773135 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.773020 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:39.773135 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.773035 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jfpb7 for pod openshift-network-diagnostics/network-check-target-mzzxk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:39.773135 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:39.773123 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7 podName:1daccc38-8893-4df5-b7d6-357c27b4e705 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:40.273085409 +0000 UTC m=+3.045622512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jfpb7" (UniqueName: "kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7") pod "network-check-target-mzzxk" (UID: "1daccc38-8893-4df5-b7d6-357c27b4e705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:39.773905 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.773869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56lfb\" (UniqueName: \"kubernetes.io/projected/52e9a5f8-832a-4f1e-add3-f10bf674757e-kube-api-access-56lfb\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:39.774071 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.774052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7n47\" (UniqueName: \"kubernetes.io/projected/3fe6ee90-5f50-4532-8f34-d91e4dc1fccd-kube-api-access-b7n47\") pod \"node-resolver-6jctc\" (UID: \"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd\") " pod="openshift-dns/node-resolver-6jctc"
Apr 24 21:26:39.775118 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.775079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nw4t\" (UniqueName: \"kubernetes.io/projected/967d645a-e8ea-4968-bac5-2446d61f1581-kube-api-access-6nw4t\") pod \"iptables-alerter-wxlkl\" (UID: \"967d645a-e8ea-4968-bac5-2446d61f1581\") " pod="openshift-network-operator/iptables-alerter-wxlkl"
Apr 24 21:26:39.775633 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.775617 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhg48\" (UniqueName: \"kubernetes.io/projected/c3c7543c-7f18-44b0-b64e-519be8319862-kube-api-access-jhg48\") pod \"tuned-h4s4s\" (UID: \"c3c7543c-7f18-44b0-b64e-519be8319862\") " pod="openshift-cluster-node-tuning-operator/tuned-h4s4s"
Apr 24 21:26:39.861364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af21b504-2a52-42ab-82d6-71911cc6a655-cni-binary-copy\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-socket-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.861556 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-socket-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.861556 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-socket-dir-parent\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861556 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-systemd\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.861556 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/01d20a9b-0255-4507-a8b5-862da4147c01-konnectivity-ca\") pod \"konnectivity-agent-tf6b4\" (UID: \"01d20a9b-0255-4507-a8b5-862da4147c01\") " pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:26:39.861556 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-k8s-cni-cncf-io\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861556 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-socket-dir-parent\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861556 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-systemd\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-k8s-cni-cncf-io\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861600 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-hostroot\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-node-log\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-run-ovn-kubernetes\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-hostroot\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af21b504-2a52-42ab-82d6-71911cc6a655-multus-daemon-config\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861742 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-node-log\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-registration-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-run-ovn-kubernetes\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-etc-kubernetes\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dktt5\" (UniqueName: \"kubernetes.io/projected/af21b504-2a52-42ab-82d6-71911cc6a655-kube-api-access-dktt5\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-etc-kubernetes\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af21b504-2a52-42ab-82d6-71911cc6a655-cni-binary-copy\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-slash\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.861879 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovnkube-config\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-registration-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-slash\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/01d20a9b-0255-4507-a8b5-862da4147c01-konnectivity-ca\") pod \"konnectivity-agent-tf6b4\" (UID: \"01d20a9b-0255-4507-a8b5-862da4147c01\") " pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.861905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-etc-selinux\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv52r\" (UniqueName: \"kubernetes.io/projected/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-kube-api-access-hv52r\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-system-cni-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-cni-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-systemd-units\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-var-lib-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-cni-bin\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-sys-fs\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-cnibin\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-log-socket\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-cni-netd\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.862684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862375 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af21b504-2a52-42ab-82d6-71911cc6a655-multus-daemon-config\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-device-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-etc-selinux\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-netns\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovnkube-config\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovn-node-metrics-cert\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-netns\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-system-cni-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-os-release\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862510 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-cni-bin\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862535 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-cni-bin\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862529 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-cnibin\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-run-netns\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862571 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-os-release\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7"
Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-cni-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g7c\" (UniqueName: \"kubernetes.io/projected/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-kube-api-access-82g7c\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862593 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-log-socket\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-device-dir\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.863470 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-systemd-units\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-cni-netd\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-run-netns\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-kubelet\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-sys-fs\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-var-lib-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-cni-bin\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-kubelet\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-kubelet\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-kubelet\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-cni-multus\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-etc-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-etc-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-var-lib-cni-multus\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.862983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-env-overrides\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863014 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864141 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/01d20a9b-0255-4507-a8b5-862da4147c01-agent-certs\") pod \"konnectivity-agent-tf6b4\" (UID: \"01d20a9b-0255-4507-a8b5-862da4147c01\") " pod="kube-system/konnectivity-agent-tf6b4" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-ovn\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-multus-certs\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-conf-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863137 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-ovn\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovnkube-script-lib\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-multus-conf-dir\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af21b504-2a52-42ab-82d6-71911cc6a655-host-run-multus-certs\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863248 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-run-openvswitch\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863386 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-env-overrides\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.864796 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.863699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovnkube-script-lib\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.865210 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.865156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-ovn-node-metrics-cert\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.865500 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.865484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/01d20a9b-0255-4507-a8b5-862da4147c01-agent-certs\") pod \"konnectivity-agent-tf6b4\" (UID: \"01d20a9b-0255-4507-a8b5-862da4147c01\") " pod="kube-system/konnectivity-agent-tf6b4" Apr 24 21:26:39.874580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.874446 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-82g7c\" (UniqueName: \"kubernetes.io/projected/6da7ad0f-d1bd-4849-8159-3fc9d885aaa7-kube-api-access-82g7c\") pod \"ovnkube-node-9jw7k\" (UID: \"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7\") " pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:39.874689 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.874629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv52r\" (UniqueName: \"kubernetes.io/projected/66f2de9b-cdf8-40c4-8408-03b7b8b66fb7-kube-api-access-hv52r\") pod \"aws-ebs-csi-driver-node-4gtkw\" (UID: \"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:39.874917 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.874783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktt5\" (UniqueName: \"kubernetes.io/projected/af21b504-2a52-42ab-82d6-71911cc6a655-kube-api-access-dktt5\") pod \"multus-dqnl7\" (UID: \"af21b504-2a52-42ab-82d6-71911cc6a655\") " pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.949791 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.949756 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" Apr 24 21:26:39.958665 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.958642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wxlkl" Apr 24 21:26:39.966403 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.966378 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6jctc" Apr 24 21:26:39.971903 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.971874 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sfdpv" Apr 24 21:26:39.978466 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.978445 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" Apr 24 21:26:39.985028 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.985010 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dqnl7" Apr 24 21:26:39.992608 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:39.992583 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" Apr 24 21:26:40.000239 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.000217 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" Apr 24 21:26:40.005844 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.005823 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-tf6b4" Apr 24 21:26:40.266007 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.265925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:40.266204 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:40.266094 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:40.266282 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:40.266219 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:41.266166239 +0000 UTC m=+4.038703323 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:40.361342 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.361212 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf21b504_2a52_42ab_82d6_71911cc6a655.slice/crio-b24f638a8dc8a56a5097958350482f02a0414f4392d1ffb5a43c790ba6ec2996 WatchSource:0}: Error finding container b24f638a8dc8a56a5097958350482f02a0414f4392d1ffb5a43c790ba6ec2996: Status 404 returned error can't find the container with id b24f638a8dc8a56a5097958350482f02a0414f4392d1ffb5a43c790ba6ec2996 Apr 24 21:26:40.362584 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.362555 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d20a9b_0255_4507_a8b5_862da4147c01.slice/crio-2f86ccc0a6c82c219206cf26a5ce693cfbfcd879c04c8cd4a9324364c50cf828 WatchSource:0}: Error finding container 2f86ccc0a6c82c219206cf26a5ce693cfbfcd879c04c8cd4a9324364c50cf828: Status 404 returned error can't find the container with id 2f86ccc0a6c82c219206cf26a5ce693cfbfcd879c04c8cd4a9324364c50cf828 Apr 24 21:26:40.366242 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.366214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:40.366343 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:40.366328 2574 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:40.366399 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.366342 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967d645a_e8ea_4968_bac5_2446d61f1581.slice/crio-22b10aef1b32716fb5af28cd7c6aa996e4db375754c9d0d0336e3f127a263eb6 WatchSource:0}: Error finding container 22b10aef1b32716fb5af28cd7c6aa996e4db375754c9d0d0336e3f127a263eb6: Status 404 returned error can't find the container with id 22b10aef1b32716fb5af28cd7c6aa996e4db375754c9d0d0336e3f127a263eb6 Apr 24 21:26:40.366399 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:40.366351 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:40.366399 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:40.366364 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jfpb7 for pod openshift-network-diagnostics/network-check-target-mzzxk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:40.366531 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:40.366415 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7 podName:1daccc38-8893-4df5-b7d6-357c27b4e705 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:41.36639578 +0000 UTC m=+4.138932875 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jfpb7" (UniqueName: "kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7") pod "network-check-target-mzzxk" (UID: "1daccc38-8893-4df5-b7d6-357c27b4e705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:40.367329 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.367308 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66f2de9b_cdf8_40c4_8408_03b7b8b66fb7.slice/crio-13747b26962efc10124f8109f945c3634f8d23d21df16d7512441cec26877a20 WatchSource:0}: Error finding container 13747b26962efc10124f8109f945c3634f8d23d21df16d7512441cec26877a20: Status 404 returned error can't find the container with id 13747b26962efc10124f8109f945c3634f8d23d21df16d7512441cec26877a20 Apr 24 21:26:40.368126 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.368083 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572a1510_69a8_48bf_94b2_311fd0c0d92f.slice/crio-d5160381f7b3ac9ca2320111e9fee10eec39354c03af99bee91c231b225f12b9 WatchSource:0}: Error finding container d5160381f7b3ac9ca2320111e9fee10eec39354c03af99bee91c231b225f12b9: Status 404 returned error can't find the container with id d5160381f7b3ac9ca2320111e9fee10eec39354c03af99bee91c231b225f12b9 Apr 24 21:26:40.368976 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.368950 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe6ee90_5f50_4532_8f34_d91e4dc1fccd.slice/crio-1ebede6e2f45c13d7fb3659fbe60d37550e94d6fa643bdb98c0cbfb99ca63e52 WatchSource:0}: Error finding container 1ebede6e2f45c13d7fb3659fbe60d37550e94d6fa643bdb98c0cbfb99ca63e52: Status 404 returned error can't find the 
container with id 1ebede6e2f45c13d7fb3659fbe60d37550e94d6fa643bdb98c0cbfb99ca63e52 Apr 24 21:26:40.370984 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.370932 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da7ad0f_d1bd_4849_8159_3fc9d885aaa7.slice/crio-ae7c8517d7de5e1b13c9bacfcf0d794cf563b070978a788567761463587d0651 WatchSource:0}: Error finding container ae7c8517d7de5e1b13c9bacfcf0d794cf563b070978a788567761463587d0651: Status 404 returned error can't find the container with id ae7c8517d7de5e1b13c9bacfcf0d794cf563b070978a788567761463587d0651 Apr 24 21:26:40.371410 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.371387 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod425e4fdf_8950_4297_b9c8_488b3e610f40.slice/crio-02d659f072441bb4a17963c0edeaa61601def948bea0456be227aca8fc395bd4 WatchSource:0}: Error finding container 02d659f072441bb4a17963c0edeaa61601def948bea0456be227aca8fc395bd4: Status 404 returned error can't find the container with id 02d659f072441bb4a17963c0edeaa61601def948bea0456be227aca8fc395bd4 Apr 24 21:26:40.372072 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:26:40.372027 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c7543c_7f18_44b0_b64e_519be8319862.slice/crio-12ba1c173710213d0a1fa7ed65a6cfa5563d7a4a14d3b6df9f416a477ebc20ee WatchSource:0}: Error finding container 12ba1c173710213d0a1fa7ed65a6cfa5563d7a4a14d3b6df9f416a477ebc20ee: Status 404 returned error can't find the container with id 12ba1c173710213d0a1fa7ed65a6cfa5563d7a4a14d3b6df9f416a477ebc20ee Apr 24 21:26:40.684121 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.683981 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:38 +0000 UTC" 
deadline="2028-02-04 05:58:40.298991984 +0000 UTC" Apr 24 21:26:40.684121 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.684021 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15608h31m59.614974738s" Apr 24 21:26:40.796991 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.796951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerStarted","Data":"02d659f072441bb4a17963c0edeaa61601def948bea0456be227aca8fc395bd4"} Apr 24 21:26:40.798516 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.798481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"ae7c8517d7de5e1b13c9bacfcf0d794cf563b070978a788567761463587d0651"} Apr 24 21:26:40.802038 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.802012 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sfdpv" event={"ID":"572a1510-69a8-48bf-94b2-311fd0c0d92f","Type":"ContainerStarted","Data":"d5160381f7b3ac9ca2320111e9fee10eec39354c03af99bee91c231b225f12b9"} Apr 24 21:26:40.803838 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.803792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" event={"ID":"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7","Type":"ContainerStarted","Data":"13747b26962efc10124f8109f945c3634f8d23d21df16d7512441cec26877a20"} Apr 24 21:26:40.805285 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.805237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wxlkl" event={"ID":"967d645a-e8ea-4968-bac5-2446d61f1581","Type":"ContainerStarted","Data":"22b10aef1b32716fb5af28cd7c6aa996e4db375754c9d0d0336e3f127a263eb6"} 
Apr 24 21:26:40.808008 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.807944 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tf6b4" event={"ID":"01d20a9b-0255-4507-a8b5-862da4147c01","Type":"ContainerStarted","Data":"2f86ccc0a6c82c219206cf26a5ce693cfbfcd879c04c8cd4a9324364c50cf828"} Apr 24 21:26:40.815216 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.815183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal" event={"ID":"fa9ef8c20bc608bfc7fe350ef3c4a29b","Type":"ContainerStarted","Data":"395748944e40f4dc643ffcd8fdcc1e17dc966ff37b8ede0c15f100679750810a"} Apr 24 21:26:40.827717 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.825121 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6jctc" event={"ID":"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd","Type":"ContainerStarted","Data":"1ebede6e2f45c13d7fb3659fbe60d37550e94d6fa643bdb98c0cbfb99ca63e52"} Apr 24 21:26:40.832206 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.832181 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dqnl7" event={"ID":"af21b504-2a52-42ab-82d6-71911cc6a655","Type":"ContainerStarted","Data":"b24f638a8dc8a56a5097958350482f02a0414f4392d1ffb5a43c790ba6ec2996"} Apr 24 21:26:40.840073 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:40.840048 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" event={"ID":"c3c7543c-7f18-44b0-b64e-519be8319862","Type":"ContainerStarted","Data":"12ba1c173710213d0a1fa7ed65a6cfa5563d7a4a14d3b6df9f416a477ebc20ee"} Apr 24 21:26:41.277475 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:41.277416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: 
\"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:41.277639 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:41.277580 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:41.277720 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:41.277648 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:43.277627432 +0000 UTC m=+6.050164520 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:41.378320 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:41.378281 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:41.378489 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:41.378454 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:41.378489 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:41.378476 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:41.378489 ip-10-0-133-209 
kubenswrapper[2574]: E0424 21:26:41.378490 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jfpb7 for pod openshift-network-diagnostics/network-check-target-mzzxk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:41.378661 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:41.378549 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7 podName:1daccc38-8893-4df5-b7d6-357c27b4e705 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:43.378530415 +0000 UTC m=+6.151067499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jfpb7" (UniqueName: "kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7") pod "network-check-target-mzzxk" (UID: "1daccc38-8893-4df5-b7d6-357c27b4e705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:41.782072 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:41.780916 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:41.782072 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:41.781082 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:41.782072 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:41.781830 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:41.782072 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:41.782016 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:41.874935 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:41.874897 2574 generic.go:358] "Generic (PLEG): container finished" podID="ef10e2637f5c12380b7bcf89d82ff9d2" containerID="7cd2d98578eae7c9503ffcd0db172254aa6ac65b0e4df578453b66947d37eaca" exitCode=0 Apr 24 21:26:41.876010 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:41.875984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" event={"ID":"ef10e2637f5c12380b7bcf89d82ff9d2","Type":"ContainerDied","Data":"7cd2d98578eae7c9503ffcd0db172254aa6ac65b0e4df578453b66947d37eaca"} Apr 24 21:26:41.892522 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:41.891929 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-209.ec2.internal" podStartSLOduration=2.89189507 podStartE2EDuration="2.89189507s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:40.829763106 +0000 UTC m=+3.602300213" watchObservedRunningTime="2026-04-24 
21:26:41.89189507 +0000 UTC m=+4.664432175" Apr 24 21:26:42.889020 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:42.888194 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" event={"ID":"ef10e2637f5c12380b7bcf89d82ff9d2","Type":"ContainerStarted","Data":"7ce2c211d94d6d721267e1a02c4b5ecb2eccc23fed3e91f6de6f3121e618b150"} Apr 24 21:26:43.293521 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:43.293426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:43.293668 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.293621 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:43.293787 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.293689 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:47.293670167 +0000 UTC m=+10.066207270 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:43.394755 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:43.394710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:43.394920 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.394887 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:43.394920 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.394912 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:43.395032 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.394925 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jfpb7 for pod openshift-network-diagnostics/network-check-target-mzzxk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:43.395032 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.394992 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7 podName:1daccc38-8893-4df5-b7d6-357c27b4e705 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:47.394972426 +0000 UTC m=+10.167509523 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jfpb7" (UniqueName: "kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7") pod "network-check-target-mzzxk" (UID: "1daccc38-8893-4df5-b7d6-357c27b4e705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:43.779799 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:43.779717 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:43.779962 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.779855 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:43.780237 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:43.780195 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:43.780545 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:43.780330 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:45.780510 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:45.780478 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:45.781016 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:45.780614 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:45.781016 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:45.780428 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:45.781144 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:45.781066 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:47.330127 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:47.330073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:47.330631 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.330258 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:47.330631 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.330340 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:55.330318321 +0000 UTC m=+18.102855418 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:47.430652 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:47.430608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:47.430870 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.430819 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:47.430870 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.430847 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:47.430870 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.430862 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jfpb7 for pod openshift-network-diagnostics/network-check-target-mzzxk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:47.431065 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.430927 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7 podName:1daccc38-8893-4df5-b7d6-357c27b4e705 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:55.430906346 +0000 UTC m=+18.203443446 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jfpb7" (UniqueName: "kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7") pod "network-check-target-mzzxk" (UID: "1daccc38-8893-4df5-b7d6-357c27b4e705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:47.780752 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:47.780712 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:47.780915 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.780843 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:47.781303 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:47.781270 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:47.781423 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:47.781376 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:49.779529 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:49.779433 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:49.779529 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:49.779463 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:49.780017 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:49.779570 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:49.780017 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:49.779745 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:51.779650 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:51.779610 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:51.780089 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:51.779777 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:51.780089 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:51.779823 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:51.780089 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:51.779932 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:53.779129 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:53.779080 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:53.779609 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:53.779153 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:53.779609 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:53.779213 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:53.779609 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:53.779275 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:55.384840 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.384803 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:55.385286 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.384920 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:55.385286 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.384979 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:11.384962707 +0000 UTC m=+34.157499807 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:55.485931 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.485893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:55.486140 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.486076 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:55.486140 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.486112 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:55.486140 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.486136 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jfpb7 for pod openshift-network-diagnostics/network-check-target-mzzxk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:55.486312 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.486199 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7 
podName:1daccc38-8893-4df5-b7d6-357c27b4e705 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:11.486179054 +0000 UTC m=+34.258716152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jfpb7" (UniqueName: "kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7") pod "network-check-target-mzzxk" (UID: "1daccc38-8893-4df5-b7d6-357c27b4e705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:55.587736 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.587685 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-209.ec2.internal" podStartSLOduration=16.587668994 podStartE2EDuration="16.587668994s" podCreationTimestamp="2026-04-24 21:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:42.903716018 +0000 UTC m=+5.676253123" watchObservedRunningTime="2026-04-24 21:26:55.587668994 +0000 UTC m=+18.360206099" Apr 24 21:26:55.588268 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.588249 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-h6cws"] Apr 24 21:26:55.642633 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.642561 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:26:55.642787 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.642636 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc" Apr 24 21:26:55.779488 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.779451 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:26:55.779662 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.779452 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:26:55.779662 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.779614 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:26:55.779779 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.779756 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:26:55.787603 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.787571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:26:55.787755 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.787629 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-dbus\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:26:55.787755 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.787698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-kubelet-config\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:26:55.888623 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.888592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-kubelet-config\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:26:55.888836 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.888662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:55.888836 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.888698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-dbus\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:55.888836 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.888710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-kubelet-config\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:55.888836 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.888808 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:55.889009 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:55.888851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-dbus\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:55.889009 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:55.888869 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret podName:500cc7d2-1561-40b2-956f-6e2b94ec6ebc nodeName:}" failed. No retries permitted until 2026-04-24 21:26:56.388849038 +0000 UTC m=+19.161386136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret") pod "global-pull-secret-syncer-h6cws" (UID: "500cc7d2-1561-40b2-956f-6e2b94ec6ebc") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:56.392032 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:56.391992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:56.392499 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:56.392176 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:56.392499 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:56.392253 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret podName:500cc7d2-1561-40b2-956f-6e2b94ec6ebc nodeName:}" failed. No retries permitted until 2026-04-24 21:26:57.392233633 +0000 UTC m=+20.164770773 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret") pod "global-pull-secret-syncer-h6cws" (UID: "500cc7d2-1561-40b2-956f-6e2b94ec6ebc") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:57.398787 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.398586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:57.399375 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:57.398854 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:57.399375 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:57.398935 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret podName:500cc7d2-1561-40b2-956f-6e2b94ec6ebc nodeName:}" failed. No retries permitted until 2026-04-24 21:26:59.398917009 +0000 UTC m=+22.171454107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret") pod "global-pull-secret-syncer-h6cws" (UID: "500cc7d2-1561-40b2-956f-6e2b94ec6ebc") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:57.780597 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.780382 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:26:57.780702 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.780440 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:57.780702 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:57.780649 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705"
Apr 24 21:26:57.780819 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.780462 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:57.780869 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:57.780839 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc"
Apr 24 21:26:57.780982 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:57.780957 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e"
Apr 24 21:26:57.918656 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.918629 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 21:26:57.918970 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.918944 2574 generic.go:358] "Generic (PLEG): container finished" podID="6da7ad0f-d1bd-4849-8159-3fc9d885aaa7" containerID="5344161e6ed149d06ecb7873d5aaf966f25753d78f6d2b7693f6f5ead80ee951" exitCode=1
Apr 24 21:26:57.919074 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.918987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"636fb73de62f26c15ba6d4cc88b5928acbeb8d4ad1794b00986df061bed36442"}
Apr 24 21:26:57.919074 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.919019 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"5b6a65c5222936c2e78908477db339b7bd891302e20ec38287e7beddc128f5e5"}
Apr 24 21:26:57.919074 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.919033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"f3eddadc737aebc110566a489bf35987a0c6010dfc31602187ebe34fae6c0201"}
Apr 24 21:26:57.919074 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.919042 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerDied","Data":"5344161e6ed149d06ecb7873d5aaf966f25753d78f6d2b7693f6f5ead80ee951"}
Apr 24 21:26:57.919074 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.919051 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"7149759ea8430d3255a9056ca5025ba41140299cf68725d4c52b264fb307cc3a"}
Apr 24 21:26:57.920639 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.920605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sfdpv" event={"ID":"572a1510-69a8-48bf-94b2-311fd0c0d92f","Type":"ContainerStarted","Data":"79ced9a7c8b03f9b0e4953d20254f246e9983506a479d975232b50d810684a4c"}
Apr 24 21:26:57.922480 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.922373 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" event={"ID":"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7","Type":"ContainerStarted","Data":"d4a6cfa639ef34c7abe58ec47a62c0eb74820e2d805805835c04a708c34ccce3"}
Apr 24 21:26:57.924037 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.924012 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tf6b4" event={"ID":"01d20a9b-0255-4507-a8b5-862da4147c01","Type":"ContainerStarted","Data":"4e513c567b05dcffe7838622aa2355bbb6ef285f6dbb224d99a960a828a0c301"}
Apr 24 21:26:57.925480 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.925447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6jctc" event={"ID":"3fe6ee90-5f50-4532-8f34-d91e4dc1fccd","Type":"ContainerStarted","Data":"36132ef4438bfb18789a93390e3ce1ce807704a8028e1a384a0bd008fd05218b"}
Apr 24 21:26:57.927245 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.927220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dqnl7" event={"ID":"af21b504-2a52-42ab-82d6-71911cc6a655","Type":"ContainerStarted","Data":"caccaec20d112cfab83af18e780fd0a95c05f3f2ebb60487cd5775f7777ee877"}
Apr 24 21:26:57.929391 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.929362 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" event={"ID":"c3c7543c-7f18-44b0-b64e-519be8319862","Type":"ContainerStarted","Data":"bc9b08cdbe8bc66401450b793215dac96a9e1bd8280a09049a24a7d9d94ea993"}
Apr 24 21:26:57.930991 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.930966 2574 generic.go:358] "Generic (PLEG): container finished" podID="425e4fdf-8950-4297-b9c8-488b3e610f40" containerID="e0ba591640e4375ee22b80a921ae902279707019b3162f0906eadcb3da087867" exitCode=0
Apr 24 21:26:57.931091 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.930997 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerDied","Data":"e0ba591640e4375ee22b80a921ae902279707019b3162f0906eadcb3da087867"}
Apr 24 21:26:57.955523 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.955474 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sfdpv" podStartSLOduration=4.06762025 podStartE2EDuration="20.955457094s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.370759134 +0000 UTC m=+3.143296220" lastFinishedPulling="2026-04-24 21:26:57.25859597 +0000 UTC m=+20.031133064" observedRunningTime="2026-04-24 21:26:57.949864832 +0000 UTC m=+20.722401935" watchObservedRunningTime="2026-04-24 21:26:57.955457094 +0000 UTC m=+20.727994190"
Apr 24 21:26:57.967845 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.967810 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tf6b4" podStartSLOduration=11.998086495 podStartE2EDuration="20.967797911s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.364915956 +0000 UTC m=+3.137453042" lastFinishedPulling="2026-04-24 21:26:49.334627361 +0000 UTC m=+12.107164458" observedRunningTime="2026-04-24 21:26:57.967763647 +0000 UTC m=+20.740300750" watchObservedRunningTime="2026-04-24 21:26:57.967797911 +0000 UTC m=+20.740334995"
Apr 24 21:26:57.986282 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:57.986241 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-h4s4s" podStartSLOduration=4.100410829 podStartE2EDuration="20.98622853s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.37441325 +0000 UTC m=+3.146950331" lastFinishedPulling="2026-04-24 21:26:57.26023093 +0000 UTC m=+20.032768032" observedRunningTime="2026-04-24 21:26:57.986198042 +0000 UTC m=+20.758735146" watchObservedRunningTime="2026-04-24 21:26:57.98622853 +0000 UTC m=+20.758765633"
Apr 24 21:26:58.003401 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.003359 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6jctc" podStartSLOduration=4.324505735 podStartE2EDuration="21.003347149s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.371238282 +0000 UTC m=+3.143775363" lastFinishedPulling="2026-04-24 21:26:57.050079669 +0000 UTC m=+19.822616777" observedRunningTime="2026-04-24 21:26:58.003176176 +0000 UTC m=+20.775713282" watchObservedRunningTime="2026-04-24 21:26:58.003347149 +0000 UTC m=+20.775884251"
Apr 24 21:26:58.045882 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.045757 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dqnl7" podStartSLOduration=4.124448253 podStartE2EDuration="21.045741467s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.363658587 +0000 UTC m=+3.136195671" lastFinishedPulling="2026-04-24 21:26:57.284951798 +0000 UTC m=+20.057488885" observedRunningTime="2026-04-24 21:26:58.0454451 +0000 UTC m=+20.817982215" watchObservedRunningTime="2026-04-24 21:26:58.045741467 +0000 UTC m=+20.818278572"
Apr 24 21:26:58.800663 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.800508 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:26:58.935621 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.935583 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 21:26:58.935969 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.935937 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"0fefce63def7d5f3e88af53d152f7f305ad2a9f5ed963777ef10c417bb8f2f33"}
Apr 24 21:26:58.937804 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.937779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" event={"ID":"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7","Type":"ContainerStarted","Data":"d1bc9c1ab6371ca2e62614a3226825a32f0eaa17bc81a0d284bd22a080bef26d"}
Apr 24 21:26:58.939177 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.939147 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wxlkl" event={"ID":"967d645a-e8ea-4968-bac5-2446d61f1581","Type":"ContainerStarted","Data":"80978f6c517f4854230aae5bacf3460d4bc62d9549833b8a0b1e191a80c76852"}
Apr 24 21:26:58.957152 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:58.957091 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wxlkl" podStartSLOduration=5.066635462 podStartE2EDuration="21.957077525s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.368232084 +0000 UTC m=+3.140769165" lastFinishedPulling="2026-04-24 21:26:57.258674131 +0000 UTC m=+20.031211228" observedRunningTime="2026-04-24 21:26:58.956721566 +0000 UTC m=+21.729258679" watchObservedRunningTime="2026-04-24 21:26:58.957077525 +0000 UTC m=+21.729614627"
Apr 24 21:26:59.417520 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:59.417416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:59.417685 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:59.417581 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:59.417685 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:59.417659 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret podName:500cc7d2-1561-40b2-956f-6e2b94ec6ebc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.417634685 +0000 UTC m=+26.190171766 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret") pod "global-pull-secret-syncer-h6cws" (UID: "500cc7d2-1561-40b2-956f-6e2b94ec6ebc") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:59.717241 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:59.717142 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:26:58.800645075Z","UUID":"69d78ed2-3bb6-483e-ae06-98f23411b211","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:26:59.718898 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:59.718874 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:26:59.719020 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:59.718908 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:26:59.779895 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:59.779865 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:26:59.779895 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:59.779886 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:26:59.779895 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:26:59.779865 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:26:59.780204 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:59.779981 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e"
Apr 24 21:26:59.780204 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:59.780045 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705"
Apr 24 21:26:59.780204 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:26:59.780119 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc"
Apr 24 21:27:00.946284 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:00.946254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 21:27:00.946853 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:00.946668 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"76b893abe7ca076687bc1fc2454906f069d09cd8bc514faf57131028a47404f4"}
Apr 24 21:27:00.948574 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:00.948547 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" event={"ID":"66f2de9b-cdf8-40c4-8408-03b7b8b66fb7","Type":"ContainerStarted","Data":"cce7ee3ab52afb690c08a2b81b113bcad4d313551626988cf5a5f43ee8067da1"}
Apr 24 21:27:00.974697 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:00.974642 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4gtkw" podStartSLOduration=4.26853787 podStartE2EDuration="23.974623868s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.369082404 +0000 UTC m=+3.141619491" lastFinishedPulling="2026-04-24 21:27:00.075168404 +0000 UTC m=+22.847705489" observedRunningTime="2026-04-24 21:27:00.974072913 +0000 UTC m=+23.746610017" watchObservedRunningTime="2026-04-24 21:27:00.974623868 +0000 UTC m=+23.747160970"
Apr 24 21:27:01.697724 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:01.697690 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:27:01.698388 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:01.698363 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:27:01.779530 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:01.779489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:27:01.779687 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:01.779491 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:27:01.779687 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:01.779620 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:01.779687 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:01.779620 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705"
Apr 24 21:27:01.779798 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:01.779731 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e"
Apr 24 21:27:01.779798 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:01.779779 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc"
Apr 24 21:27:01.950684 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:01.950601 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:27:01.951281 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:01.951196 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tf6b4"
Apr 24 21:27:02.953581 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.953401 2574 generic.go:358] "Generic (PLEG): container finished" podID="425e4fdf-8950-4297-b9c8-488b3e610f40" containerID="61ed398ba8d9359d5549130f6d25838014744bc7ac398aa2d97c3de2bfb20261" exitCode=0
Apr 24 21:27:02.954035 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.953477 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerDied","Data":"61ed398ba8d9359d5549130f6d25838014744bc7ac398aa2d97c3de2bfb20261"}
Apr 24 21:27:02.956692 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.956677 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 21:27:02.957030 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.956999 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"da8149914c4a888e396b166f4d05eabd5b77813ff36c00acb9b83c5e9f87d419"}
Apr 24 21:27:02.957389 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.957371 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:27:02.957490 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.957397 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:27:02.957548 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.957503 2574 scope.go:117] "RemoveContainer" containerID="5344161e6ed149d06ecb7873d5aaf966f25753d78f6d2b7693f6f5ead80ee951"
Apr 24 21:27:02.972840 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:02.972823 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:27:03.450655 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.450578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:03.450803 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:03.450711 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:03.450803 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:03.450783 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret podName:500cc7d2-1561-40b2-956f-6e2b94ec6ebc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:11.450761009 +0000 UTC m=+34.223298113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret") pod "global-pull-secret-syncer-h6cws" (UID: "500cc7d2-1561-40b2-956f-6e2b94ec6ebc") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:03.779980 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.779952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:27:03.780126 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.779988 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:27:03.780126 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.779955 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:03.780126 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:03.780076 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705"
Apr 24 21:27:03.780255 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:03.780178 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc"
Apr 24 21:27:03.780302 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:03.780282 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e"
Apr 24 21:27:03.960609 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.960527 2574 generic.go:358] "Generic (PLEG): container finished" podID="425e4fdf-8950-4297-b9c8-488b3e610f40" containerID="705a0f2feeae4474af1cf9a9434cf324c063a34baeca633f1732242d8aaedceb" exitCode=0
Apr 24 21:27:03.961026 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.960604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerDied","Data":"705a0f2feeae4474af1cf9a9434cf324c063a34baeca633f1732242d8aaedceb"}
Apr 24 21:27:03.964330 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.964312 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 21:27:03.964740 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.964660 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" event={"ID":"6da7ad0f-d1bd-4849-8159-3fc9d885aaa7","Type":"ContainerStarted","Data":"970e981c9cbc47585809ed755223164e7f65eb82c34a9dec590a46db81f8769d"}
Apr 24 21:27:03.964841 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.964752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:27:03.979571 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:03.979545 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:27:04.023037 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.022989 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k" podStartSLOduration=10.055760334 podStartE2EDuration="27.022975592s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.372408782 +0000 UTC m=+3.144945868" lastFinishedPulling="2026-04-24 21:26:57.339624034 +0000 UTC m=+20.112161126" observedRunningTime="2026-04-24 21:27:04.021689688 +0000 UTC m=+26.794226826" watchObservedRunningTime="2026-04-24 21:27:04.022975592 +0000 UTC m=+26.795512694"
Apr 24 21:27:04.262968 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.262758 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h6cws"]
Apr 24 21:27:04.263134 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.262997 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:04.263214 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:04.263157 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc"
Apr 24 21:27:04.267287 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.267249 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c4ck8"]
Apr 24 21:27:04.267429 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.267364 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:27:04.267493 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:04.267474 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e"
Apr 24 21:27:04.268015 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.267991 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mzzxk"]
Apr 24 21:27:04.268144 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.268073 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:27:04.268218 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:04.268182 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705"
Apr 24 21:27:04.968778 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.968731 2574 generic.go:358] "Generic (PLEG): container finished" podID="425e4fdf-8950-4297-b9c8-488b3e610f40" containerID="1d802cb8049ce95ae846aa9d3213c46ac9a4fec0791667d42e8f885a9a84e421" exitCode=0
Apr 24 21:27:04.968778 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:04.968766 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerDied","Data":"1d802cb8049ce95ae846aa9d3213c46ac9a4fec0791667d42e8f885a9a84e421"}
Apr 24 21:27:05.779492 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:05.779464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8"
Apr 24 21:27:05.779671 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:05.779460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:05.779739 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:05.779464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:27:05.779739 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:05.779718 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc"
Apr 24 21:27:05.779841 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:05.779594 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e"
Apr 24 21:27:05.779841 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:05.779764 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705"
Apr 24 21:27:07.780515 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:07.780478 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:07.781321 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:07.780523 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk"
Apr 24 21:27:07.781321 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:07.780590 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc" Apr 24 21:27:07.781321 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:07.780665 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:27:07.781321 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:07.780694 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:27:07.781321 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:07.780790 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:27:09.779111 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:09.779073 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:27:09.779552 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:09.779132 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:27:09.779552 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:09.779221 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mzzxk" podUID="1daccc38-8893-4df5-b7d6-357c27b4e705" Apr 24 21:27:09.779552 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:09.779290 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:27:09.779552 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:09.779294 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:27:09.779552 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:09.779370 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-h6cws" podUID="500cc7d2-1561-40b2-956f-6e2b94ec6ebc" Apr 24 21:27:10.542279 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.542252 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-209.ec2.internal" event="NodeReady" Apr 24 21:27:10.542462 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.542400 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:27:10.610695 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.610660 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2xr7h"] Apr 24 21:27:10.624680 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.624656 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w7kpc"] Apr 24 21:27:10.624822 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.624801 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:10.628071 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.628049 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-42hwh\"" Apr 24 21:27:10.628207 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.628078 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:27:10.628207 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.628174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:27:10.628422 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.628405 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:27:10.644318 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.644295 2574 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2xr7h"] Apr 24 21:27:10.644442 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.644324 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w7kpc"] Apr 24 21:27:10.644442 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.644430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.647162 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.647145 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:27:10.647642 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.647626 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mzt4d\"" Apr 24 21:27:10.647726 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.647651 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:27:10.702760 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.702727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:10.702924 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.702784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fls74\" (UniqueName: \"kubernetes.io/projected/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-kube-api-access-fls74\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:10.803127 ip-10-0-133-209 kubenswrapper[2574]: I0424 
21:27:10.803020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.803127 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.803069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:10.803127 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.803087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/629df89e-192f-4942-ba14-4cb4f95cef70-tmp-dir\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.803603 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.803137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fls74\" (UniqueName: \"kubernetes.io/projected/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-kube-api-access-fls74\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:10.803603 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.803154 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/629df89e-192f-4942-ba14-4cb4f95cef70-config-volume\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.803603 ip-10-0-133-209 
kubenswrapper[2574]: I0424 21:27:10.803172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d4xd\" (UniqueName: \"kubernetes.io/projected/629df89e-192f-4942-ba14-4cb4f95cef70-kube-api-access-2d4xd\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.803603 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:10.803198 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:10.803603 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:10.803267 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:11.30325071 +0000 UTC m=+34.075787791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found Apr 24 21:27:10.816392 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.816252 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fls74\" (UniqueName: \"kubernetes.io/projected/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-kube-api-access-fls74\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:10.904325 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.904291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " 
pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.904491 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.904353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/629df89e-192f-4942-ba14-4cb4f95cef70-tmp-dir\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.904491 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.904388 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/629df89e-192f-4942-ba14-4cb4f95cef70-config-volume\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.904491 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.904410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d4xd\" (UniqueName: \"kubernetes.io/projected/629df89e-192f-4942-ba14-4cb4f95cef70-kube-api-access-2d4xd\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.904491 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:10.904434 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:10.904692 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:10.904496 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:11.404481522 +0000 UTC m=+34.177018607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found Apr 24 21:27:10.904985 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.904965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/629df89e-192f-4942-ba14-4cb4f95cef70-config-volume\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.915801 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.915781 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/629df89e-192f-4942-ba14-4cb4f95cef70-tmp-dir\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:10.922472 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:10.922453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d4xd\" (UniqueName: \"kubernetes.io/projected/629df89e-192f-4942-ba14-4cb4f95cef70-kube-api-access-2d4xd\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:11.307151 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.307093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:11.307316 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.307267 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 
21:27:11.307378 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.307347 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.30732453 +0000 UTC m=+35.079861628 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found Apr 24 21:27:11.408227 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.408132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:27:11.408227 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.408196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:27:11.408419 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.408285 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:11.408419 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.408338 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:11.408419 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.408351 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.408335996 +0000 UTC m=+66.180873077 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:11.408419 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.408385 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.408372603 +0000 UTC m=+35.180909707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found Apr 24 21:27:11.509219 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.509178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:27:11.509366 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.509251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: 
\"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:27:11.509366 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.509322 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:11.509366 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.509359 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:11.509459 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.509377 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:11.509459 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.509386 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret podName:500cc7d2-1561-40b2-956f-6e2b94ec6ebc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.509369446 +0000 UTC m=+50.281906527 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret") pod "global-pull-secret-syncer-h6cws" (UID: "500cc7d2-1561-40b2-956f-6e2b94ec6ebc") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:11.509459 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.509387 2574 projected.go:194] Error preparing data for projected volume kube-api-access-jfpb7 for pod openshift-network-diagnostics/network-check-target-mzzxk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:11.509459 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:11.509414 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7 podName:1daccc38-8893-4df5-b7d6-357c27b4e705 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.509408823 +0000 UTC m=+66.281945949 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jfpb7" (UniqueName: "kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7") pod "network-check-target-mzzxk" (UID: "1daccc38-8893-4df5-b7d6-357c27b4e705") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:11.778905 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.778867 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws" Apr 24 21:27:11.779063 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.778948 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:27:11.779063 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.779048 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:27:11.781920 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.781900 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:27:11.782233 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.782212 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:27:11.782431 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.782413 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:27:11.783399 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.783382 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5kts7\"" Apr 24 21:27:11.783490 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.783412 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvhzf\"" Apr 24 21:27:11.783490 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.783412 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:27:11.985041 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.985006 2574 generic.go:358] "Generic (PLEG): container finished" podID="425e4fdf-8950-4297-b9c8-488b3e610f40" containerID="cc704ab180b9e0666db5298ce093349142a0750e9db1f6525974f23115c5d20f" exitCode=0 Apr 24 21:27:11.985404 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:11.985068 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerDied","Data":"cc704ab180b9e0666db5298ce093349142a0750e9db1f6525974f23115c5d20f"} Apr 24 21:27:12.314664 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:12.314627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:27:12.314817 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:12.314766 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:12.314854 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:12.314829 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:14.314814184 +0000 UTC m=+37.087351265 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found
Apr 24 21:27:12.415553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:12.415468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc"
Apr 24 21:27:12.415716 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:12.415592 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:12.415716 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:12.415658 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:14.415639302 +0000 UTC m=+37.188176400 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:12.989824 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:12.989794 2574 generic.go:358] "Generic (PLEG): container finished" podID="425e4fdf-8950-4297-b9c8-488b3e610f40" containerID="8240a6c0c7cea060149cf67246cc0b44bcd822689ad90cd563c87d24051a4488" exitCode=0
Apr 24 21:27:12.990308 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:12.989846 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerDied","Data":"8240a6c0c7cea060149cf67246cc0b44bcd822689ad90cd563c87d24051a4488"}
Apr 24 21:27:13.994040 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:13.994002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" event={"ID":"425e4fdf-8950-4297-b9c8-488b3e610f40","Type":"ContainerStarted","Data":"90caa0d9c154b3b97ecbd524d743a28e4c0814f8521095f379a0c64f56297d79"}
Apr 24 21:27:14.017226 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:14.017185 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hg8hv" podStartSLOduration=6.434558804 podStartE2EDuration="37.017171921s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:26:40.374111499 +0000 UTC m=+3.146648593" lastFinishedPulling="2026-04-24 21:27:10.956724615 +0000 UTC m=+33.729261710" observedRunningTime="2026-04-24 21:27:14.015876479 +0000 UTC m=+36.788413605" watchObservedRunningTime="2026-04-24 21:27:14.017171921 +0000 UTC m=+36.789709024"
Apr 24 21:27:14.329459 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:14.329422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h"
Apr 24 21:27:14.329630 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:14.329543 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:14.329630 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:14.329594 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:18.329580569 +0000 UTC m=+41.102117650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found
Apr 24 21:27:14.430473 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:14.430441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc"
Apr 24 21:27:14.430623 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:14.430566 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:14.430623 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:14.430616 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:18.430603266 +0000 UTC m=+41.203140347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:17.776638 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.776583 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"]
Apr 24 21:27:17.782349 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.782330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:17.784806 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.784787 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:27:17.785846 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.785826 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:27:17.785948 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.785878 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 21:27:17.785948 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.785893 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:27:17.788565 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.788546 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"]
Apr 24 21:27:17.805326 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.805305 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"]
Apr 24 21:27:17.809030 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.809016 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:17.811519 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.811499 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 21:27:17.811852 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.811833 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 21:27:17.811991 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.811974 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 21:27:17.812122 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.812093 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 21:27:17.825191 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.825167 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"]
Apr 24 21:27:17.953163 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-tmp\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:17.953335 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:17.953335 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953240 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:17.953335 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953284 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-klusterlet-config\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:17.953335 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjr2\" (UniqueName: \"kubernetes.io/projected/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-kube-api-access-sjjr2\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:17.953335 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m58q4\" (UniqueName: \"kubernetes.io/projected/053382cc-bd8f-45e7-b716-226dc9c02245-kube-api-access-m58q4\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:17.953580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-ca\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:17.953580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-hub\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:17.953580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:17.953561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/053382cc-bd8f-45e7-b716-226dc9c02245-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.054632 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.054632 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.054847 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-klusterlet-config\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:18.054847 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjr2\" (UniqueName: \"kubernetes.io/projected/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-kube-api-access-sjjr2\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:18.054847 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m58q4\" (UniqueName: \"kubernetes.io/projected/053382cc-bd8f-45e7-b716-226dc9c02245-kube-api-access-m58q4\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.054995 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-ca\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.054995 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-hub\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.054995 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/053382cc-bd8f-45e7-b716-226dc9c02245-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.054995 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.054982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-tmp\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:18.055383 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.055356 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-tmp\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:18.055770 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.055741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/053382cc-bd8f-45e7-b716-226dc9c02245-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.058029 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.058009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-ca\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.058157 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.058058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-hub\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.058157 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.058127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.058157 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.058143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-klusterlet-config\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:18.058588 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.058569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/053382cc-bd8f-45e7-b716-226dc9c02245-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.066124 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.066082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjr2\" (UniqueName: \"kubernetes.io/projected/1f6ad131-bc14-42db-b3dd-6c6c7dfc8758-kube-api-access-sjjr2\") pod \"klusterlet-addon-workmgr-69bb95d676-llqgc\" (UID: \"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:18.066393 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.066372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m58q4\" (UniqueName: \"kubernetes.io/projected/053382cc-bd8f-45e7-b716-226dc9c02245-kube-api-access-m58q4\") pod \"cluster-proxy-proxy-agent-f584d4588-5rfcp\" (UID: \"053382cc-bd8f-45e7-b716-226dc9c02245\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.092444 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.092427 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:18.130529 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.130502 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"
Apr 24 21:27:18.227928 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.227893 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"]
Apr 24 21:27:18.231978 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:27:18.231942 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f6ad131_bc14_42db_b3dd_6c6c7dfc8758.slice/crio-d1c6c4301d6ce236d15ad9b96a45bcc6e8016b019ceb409f809411f4fd6d8cd6 WatchSource:0}: Error finding container d1c6c4301d6ce236d15ad9b96a45bcc6e8016b019ceb409f809411f4fd6d8cd6: Status 404 returned error can't find the container with id d1c6c4301d6ce236d15ad9b96a45bcc6e8016b019ceb409f809411f4fd6d8cd6
Apr 24 21:27:18.255029 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.254342 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp"]
Apr 24 21:27:18.260223 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:27:18.260200 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod053382cc_bd8f_45e7_b716_226dc9c02245.slice/crio-ac5b1a996a136b2bede1d2545a95aea09a79e9acb472527da0a90835b6429c4b WatchSource:0}: Error finding container ac5b1a996a136b2bede1d2545a95aea09a79e9acb472527da0a90835b6429c4b: Status 404 returned error can't find the container with id ac5b1a996a136b2bede1d2545a95aea09a79e9acb472527da0a90835b6429c4b
Apr 24 21:27:18.357372 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.357289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h"
Apr 24 21:27:18.357506 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:18.357429 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:18.357506 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:18.357494 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.357478126 +0000 UTC m=+49.130015206 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found
Apr 24 21:27:18.458209 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:18.458167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc"
Apr 24 21:27:18.458377 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:18.458323 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:18.458430 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:18.458405 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.458386832 +0000 UTC m=+49.230923912 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:19.008966 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:19.008898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" event={"ID":"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758","Type":"ContainerStarted","Data":"d1c6c4301d6ce236d15ad9b96a45bcc6e8016b019ceb409f809411f4fd6d8cd6"}
Apr 24 21:27:19.011632 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:19.011600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" event={"ID":"053382cc-bd8f-45e7-b716-226dc9c02245","Type":"ContainerStarted","Data":"ac5b1a996a136b2bede1d2545a95aea09a79e9acb472527da0a90835b6429c4b"}
Apr 24 21:27:24.024501 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:24.024449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" event={"ID":"053382cc-bd8f-45e7-b716-226dc9c02245","Type":"ContainerStarted","Data":"8ad0550b2b5f977f073f1c1503a3d12321d1943c276ac46049aa08725212fd30"}
Apr 24 21:27:24.025999 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:24.025955 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" event={"ID":"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758","Type":"ContainerStarted","Data":"1155bac5b0436cf31b3f09df3264f150e8cbd71cd492d44a95360260b29aee75"}
Apr 24 21:27:24.026250 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:24.026232 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:24.028223 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:24.028201 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc"
Apr 24 21:27:24.045347 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:24.045298 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" podStartSLOduration=2.198346313 podStartE2EDuration="7.045281151s" podCreationTimestamp="2026-04-24 21:27:17 +0000 UTC" firstStartedPulling="2026-04-24 21:27:18.234075831 +0000 UTC m=+41.006612916" lastFinishedPulling="2026-04-24 21:27:23.081010668 +0000 UTC m=+45.853547754" observedRunningTime="2026-04-24 21:27:24.044207177 +0000 UTC m=+46.816744281" watchObservedRunningTime="2026-04-24 21:27:24.045281151 +0000 UTC m=+46.817818257"
Apr 24 21:27:26.031645 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:26.031607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" event={"ID":"053382cc-bd8f-45e7-b716-226dc9c02245","Type":"ContainerStarted","Data":"e0219301e1bbfea652d92541b7917322c14dde458b4a3b5e76a7e239718eafc7"}
Apr 24 21:27:26.031645 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:26.031647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" event={"ID":"053382cc-bd8f-45e7-b716-226dc9c02245","Type":"ContainerStarted","Data":"d6d2b092e904f1e8154f26490581517576bdbaa059786a5bfa0ccf73afe068fe"}
Apr 24 21:27:26.054773 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:26.054727 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" podStartSLOduration=1.799209629 podStartE2EDuration="9.054713242s" podCreationTimestamp="2026-04-24 21:27:17 +0000 UTC" firstStartedPulling="2026-04-24 21:27:18.261841626 +0000 UTC m=+41.034378707" lastFinishedPulling="2026-04-24 21:27:25.517345226 +0000 UTC m=+48.289882320" observedRunningTime="2026-04-24 21:27:26.053238154 +0000 UTC m=+48.825775257" watchObservedRunningTime="2026-04-24 21:27:26.054713242 +0000 UTC m=+48.827250344"
Apr 24 21:27:26.443965 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:26.443867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h"
Apr 24 21:27:26.444182 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:26.444016 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:26.444182 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:26.444090 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:42.444062243 +0000 UTC m=+65.216599341 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found
Apr 24 21:27:26.544571 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:26.544534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc"
Apr 24 21:27:26.544731 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:26.544679 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:26.544778 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:26.544755 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:42.544737547 +0000 UTC m=+65.317274628 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:27.551337 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:27.551294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:27.553557 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:27.553527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/500cc7d2-1561-40b2-956f-6e2b94ec6ebc-original-pull-secret\") pod \"global-pull-secret-syncer-h6cws\" (UID: \"500cc7d2-1561-40b2-956f-6e2b94ec6ebc\") " pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:27.688285 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:27.688246 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h6cws"
Apr 24 21:27:27.805433 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:27.804812 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h6cws"]
Apr 24 21:27:27.808833 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:27:27.808798 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500cc7d2_1561_40b2_956f_6e2b94ec6ebc.slice/crio-ab24efac6d2cdd444cb6731045caa9cafb60bee554e628ceebd16c1c7be1c641 WatchSource:0}: Error finding container ab24efac6d2cdd444cb6731045caa9cafb60bee554e628ceebd16c1c7be1c641: Status 404 returned error can't find the container with id ab24efac6d2cdd444cb6731045caa9cafb60bee554e628ceebd16c1c7be1c641
Apr 24 21:27:28.037171 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:28.037133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h6cws" event={"ID":"500cc7d2-1561-40b2-956f-6e2b94ec6ebc","Type":"ContainerStarted","Data":"ab24efac6d2cdd444cb6731045caa9cafb60bee554e628ceebd16c1c7be1c641"}
Apr 24 21:27:32.047895 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:32.047801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h6cws" event={"ID":"500cc7d2-1561-40b2-956f-6e2b94ec6ebc","Type":"ContainerStarted","Data":"9087d0c72a7907e6ee1474b9bdf57a63871987efecaae1e6a6abd9fc1c85e807"}
Apr 24 21:27:32.064093 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:32.064046 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h6cws" podStartSLOduration=33.10790104 podStartE2EDuration="37.064029624s" podCreationTimestamp="2026-04-24 21:26:55 +0000 UTC" firstStartedPulling="2026-04-24 21:27:27.810442176 +0000 UTC m=+50.582979261" lastFinishedPulling="2026-04-24 21:27:31.766570749 +0000 UTC m=+54.539107845" observedRunningTime="2026-04-24 21:27:32.063396641 +0000 UTC m=+54.835933739" watchObservedRunningTime="2026-04-24 21:27:32.064029624 +0000 UTC m=+54.836566719"
Apr 24 21:27:35.981837 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:35.981802 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9jw7k"
Apr 24 21:27:42.463952 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:42.463909 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h"
Apr 24 21:27:42.464444 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:42.464031 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:42.464444 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:42.464090 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:14.46407677 +0000 UTC m=+97.236613852 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found
Apr 24 21:27:42.565228 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:42.565174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc"
Apr 24 21:27:42.565364 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:42.565316 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:42.565402 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:42.565375 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:14.565360504 +0000 UTC m=+97.337897598 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found Apr 24 21:27:43.471480 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.471436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:27:43.474229 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.474207 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:27:43.482369 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:43.482351 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:27:43.482433 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:27:43.482410 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:47.482391911 +0000 UTC m=+130.254929005 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : secret "metrics-daemon-secret" not found Apr 24 21:27:43.572029 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.571988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:27:43.576567 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.576549 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:27:43.585628 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.585606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:27:43.596413 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.596392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfpb7\" (UniqueName: \"kubernetes.io/projected/1daccc38-8893-4df5-b7d6-357c27b4e705-kube-api-access-jfpb7\") pod \"network-check-target-mzzxk\" (UID: \"1daccc38-8893-4df5-b7d6-357c27b4e705\") " pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:27:43.896366 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.896284 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvhzf\"" Apr 24 21:27:43.904584 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:43.904554 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:27:44.018824 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:44.018793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mzzxk"] Apr 24 21:27:44.021953 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:27:44.021921 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1daccc38_8893_4df5_b7d6_357c27b4e705.slice/crio-26da291a95b4c2d1570060607ff24a21d960269a28d1159d69cff4225b276154 WatchSource:0}: Error finding container 26da291a95b4c2d1570060607ff24a21d960269a28d1159d69cff4225b276154: Status 404 returned error can't find the container with id 26da291a95b4c2d1570060607ff24a21d960269a28d1159d69cff4225b276154 Apr 24 21:27:44.073693 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:44.073655 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mzzxk" event={"ID":"1daccc38-8893-4df5-b7d6-357c27b4e705","Type":"ContainerStarted","Data":"26da291a95b4c2d1570060607ff24a21d960269a28d1159d69cff4225b276154"} Apr 24 21:27:47.081227 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:47.081191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mzzxk" event={"ID":"1daccc38-8893-4df5-b7d6-357c27b4e705","Type":"ContainerStarted","Data":"c4cc643a2657ce4fc6cd7ecbff3d68cdfcb7b394bfba6c4b0477ba250c11067d"} Apr 24 21:27:47.081588 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:47.081303 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:27:47.100505 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:27:47.100460 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mzzxk" 
podStartSLOduration=67.553718163 podStartE2EDuration="1m10.100445915s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:27:44.023809528 +0000 UTC m=+66.796346613" lastFinishedPulling="2026-04-24 21:27:46.570537268 +0000 UTC m=+69.343074365" observedRunningTime="2026-04-24 21:27:47.099487678 +0000 UTC m=+69.872024783" watchObservedRunningTime="2026-04-24 21:27:47.100445915 +0000 UTC m=+69.872983067" Apr 24 21:28:14.492182 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:28:14.492149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:28:14.492608 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:28:14.492307 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:14.492608 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:28:14.492377 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:29:18.492358828 +0000 UTC m=+161.264895925 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found Apr 24 21:28:14.593393 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:28:14.593356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:28:14.593578 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:28:14.593468 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:14.593578 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:28:14.593532 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:18.593517688 +0000 UTC m=+161.366054773 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found Apr 24 21:28:18.087138 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:28:18.087088 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mzzxk" Apr 24 21:28:47.517423 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:28:47.517383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:28:47.517904 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:28:47.517500 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:28:47.517904 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:28:47.517569 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs podName:52e9a5f8-832a-4f1e-add3-f10bf674757e nodeName:}" failed. No retries permitted until 2026-04-24 21:30:49.51755195 +0000 UTC m=+252.290089049 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs") pod "network-metrics-daemon-c4ck8" (UID: "52e9a5f8-832a-4f1e-add3-f10bf674757e") : secret "metrics-daemon-secret" not found Apr 24 21:29:13.634526 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:13.634479 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2xr7h" podUID="59fef02e-c780-4d6e-a4b6-1ffe904c5a5a" Apr 24 21:29:13.652634 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:13.652604 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-w7kpc" podUID="629df89e-192f-4942-ba14-4cb4f95cef70" Apr 24 21:29:14.279437 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:14.279405 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:29:14.798596 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:14.798549 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-c4ck8" podUID="52e9a5f8-832a-4f1e-add3-f10bf674757e" Apr 24 21:29:18.533182 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:18.533140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:29:18.533651 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:18.533256 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:18.533651 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:18.533321 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert podName:59fef02e-c780-4d6e-a4b6-1ffe904c5a5a nodeName:}" failed. No retries permitted until 2026-04-24 21:31:20.533304133 +0000 UTC m=+283.305841232 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert") pod "ingress-canary-2xr7h" (UID: "59fef02e-c780-4d6e-a4b6-1ffe904c5a5a") : secret "canary-serving-cert" not found Apr 24 21:29:18.597545 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:18.597516 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6jctc_3fe6ee90-5f50-4532-8f34-d91e4dc1fccd/dns-node-resolver/0.log" Apr 24 21:29:18.634173 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:18.634139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:29:18.634300 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:18.634280 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:18.634369 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:18.634359 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls podName:629df89e-192f-4942-ba14-4cb4f95cef70 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:20.634342677 +0000 UTC m=+283.406879774 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls") pod "dns-default-w7kpc" (UID: "629df89e-192f-4942-ba14-4cb4f95cef70") : secret "dns-default-metrics-tls" not found Apr 24 21:29:19.998189 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:19.998159 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sfdpv_572a1510-69a8-48bf-94b2-311fd0c0d92f/node-ca/0.log" Apr 24 21:29:23.301891 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:23.301859 2574 generic.go:358] "Generic (PLEG): container finished" podID="1f6ad131-bc14-42db-b3dd-6c6c7dfc8758" containerID="1155bac5b0436cf31b3f09df3264f150e8cbd71cd492d44a95360260b29aee75" exitCode=1 Apr 24 21:29:23.302249 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:23.301898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" event={"ID":"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758","Type":"ContainerDied","Data":"1155bac5b0436cf31b3f09df3264f150e8cbd71cd492d44a95360260b29aee75"} Apr 24 21:29:23.302249 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:23.302221 2574 scope.go:117] "RemoveContainer" containerID="1155bac5b0436cf31b3f09df3264f150e8cbd71cd492d44a95360260b29aee75" Apr 24 21:29:24.026866 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:24.026819 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" Apr 24 21:29:24.305805 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:24.305709 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" event={"ID":"1f6ad131-bc14-42db-b3dd-6c6c7dfc8758","Type":"ContainerStarted","Data":"9c8b0cc1effb0116ac7033c964f04e2bdd3f113d7c7351e56fbc87cfd89cf844"} Apr 24 21:29:24.306196 ip-10-0-133-209 
kubenswrapper[2574]: I0424 21:29:24.305992 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" Apr 24 21:29:24.306636 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:24.306615 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-69bb95d676-llqgc" Apr 24 21:29:25.779108 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:25.779072 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w7kpc" Apr 24 21:29:27.780333 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:27.780295 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:29:40.552404 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.552363 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-p4ctn"] Apr 24 21:29:40.555529 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.555507 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.561727 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.561703 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:29:40.561842 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.561775 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:29:40.562404 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.562390 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:29:40.562787 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.562773 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:29:40.562870 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.562853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b67gj\"" Apr 24 21:29:40.583057 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.583033 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-p4ctn"] Apr 24 21:29:40.694216 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.694178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/aad14a0d-3139-41e7-b2f7-77dbdab344ac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.694376 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.694237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aad14a0d-3139-41e7-b2f7-77dbdab344ac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.694376 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.694287 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s29nh\" (UniqueName: \"kubernetes.io/projected/aad14a0d-3139-41e7-b2f7-77dbdab344ac-kube-api-access-s29nh\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.694376 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.694339 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/aad14a0d-3139-41e7-b2f7-77dbdab344ac-crio-socket\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.694376 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.694360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/aad14a0d-3139-41e7-b2f7-77dbdab344ac-data-volume\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.794799 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.794769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/aad14a0d-3139-41e7-b2f7-77dbdab344ac-crio-socket\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " 
pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.794799 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.794802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/aad14a0d-3139-41e7-b2f7-77dbdab344ac-data-volume\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.794992 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.794822 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/aad14a0d-3139-41e7-b2f7-77dbdab344ac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.794992 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.794897 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/aad14a0d-3139-41e7-b2f7-77dbdab344ac-crio-socket\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.794992 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.794978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aad14a0d-3139-41e7-b2f7-77dbdab344ac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.795088 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.795026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s29nh\" (UniqueName: 
\"kubernetes.io/projected/aad14a0d-3139-41e7-b2f7-77dbdab344ac-kube-api-access-s29nh\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.795233 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.795213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/aad14a0d-3139-41e7-b2f7-77dbdab344ac-data-volume\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.795853 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.795831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/aad14a0d-3139-41e7-b2f7-77dbdab344ac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.797289 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.797269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/aad14a0d-3139-41e7-b2f7-77dbdab344ac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.820738 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.820669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s29nh\" (UniqueName: \"kubernetes.io/projected/aad14a0d-3139-41e7-b2f7-77dbdab344ac-kube-api-access-s29nh\") pod \"insights-runtime-extractor-p4ctn\" (UID: \"aad14a0d-3139-41e7-b2f7-77dbdab344ac\") " pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.863547 ip-10-0-133-209 
kubenswrapper[2574]: I0424 21:29:40.863515 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-p4ctn" Apr 24 21:29:40.986048 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:40.986012 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-p4ctn"] Apr 24 21:29:40.990167 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:29:40.990136 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaad14a0d_3139_41e7_b2f7_77dbdab344ac.slice/crio-0061c8d983fa0acdb34cee1d27065cb8102a95c3596da22413baa32b00e9b0b5 WatchSource:0}: Error finding container 0061c8d983fa0acdb34cee1d27065cb8102a95c3596da22413baa32b00e9b0b5: Status 404 returned error can't find the container with id 0061c8d983fa0acdb34cee1d27065cb8102a95c3596da22413baa32b00e9b0b5 Apr 24 21:29:41.347544 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:41.347500 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p4ctn" event={"ID":"aad14a0d-3139-41e7-b2f7-77dbdab344ac","Type":"ContainerStarted","Data":"a39052de16b3ec3ae8f285bc1eca1c49417777b990fa079da262971890ae9566"} Apr 24 21:29:41.347544 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:41.347540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p4ctn" event={"ID":"aad14a0d-3139-41e7-b2f7-77dbdab344ac","Type":"ContainerStarted","Data":"0061c8d983fa0acdb34cee1d27065cb8102a95c3596da22413baa32b00e9b0b5"} Apr 24 21:29:42.352389 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:42.352353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p4ctn" event={"ID":"aad14a0d-3139-41e7-b2f7-77dbdab344ac","Type":"ContainerStarted","Data":"dacb2fe1274117244309028b0363de0879d5bc1d062e62631f349c4eda4e02cf"} Apr 24 21:29:43.360759 
ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:43.360718 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-p4ctn" event={"ID":"aad14a0d-3139-41e7-b2f7-77dbdab344ac","Type":"ContainerStarted","Data":"b1e3ac796fd9b4bca327d612d2e58ae2764ba4f1d5dfd75aef754f86a6ae802d"} Apr 24 21:29:43.384375 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:43.384326 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-p4ctn" podStartSLOduration=1.283648188 podStartE2EDuration="3.384312571s" podCreationTimestamp="2026-04-24 21:29:40 +0000 UTC" firstStartedPulling="2026-04-24 21:29:41.041161601 +0000 UTC m=+183.813698682" lastFinishedPulling="2026-04-24 21:29:43.141825982 +0000 UTC m=+185.914363065" observedRunningTime="2026-04-24 21:29:43.383890252 +0000 UTC m=+186.156427366" watchObservedRunningTime="2026-04-24 21:29:43.384312571 +0000 UTC m=+186.156849675" Apr 24 21:29:47.756154 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.756123 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb"] Apr 24 21:29:47.759236 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.759217 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.763795 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.763773 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:29:47.763913 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.763783 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:29:47.763913 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.763795 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-krcc9\"" Apr 24 21:29:47.764020 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.763936 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:29:47.764020 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.763957 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:29:47.764020 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.763990 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:29:47.770917 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.770896 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb"] Apr 24 21:29:47.827460 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.827430 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nv7fx"] Apr 24 21:29:47.830393 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.830376 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.833182 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.833160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:29:47.833434 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.833417 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:29:47.833634 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.833621 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:29:47.833706 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.833653 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-stmx8\"" Apr 24 21:29:47.849407 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.849385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.849504 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.849420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.849504 
ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.849452 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b78255af-fb17-489c-94fb-b6f694bad656-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.849574 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.849547 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8kq\" (UniqueName: \"kubernetes.io/projected/b78255af-fb17-489c-94fb-b6f694bad656-kube-api-access-cb8kq\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.950117 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8kq\" (UniqueName: \"kubernetes.io/projected/b78255af-fb17-489c-94fb-b6f694bad656-kube-api-access-cb8kq\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.950117 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950125 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950326 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-root\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950326 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950177 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-tls\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950326 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-accelerators-collector-config\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950326 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950218 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-wtmp\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950326 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") 
" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.950326 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.950569 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-textfile\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950569 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18cd0973-265e-440e-a7e2-13e28f5fadd2-metrics-client-ca\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950569 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt66n\" (UniqueName: \"kubernetes.io/projected/18cd0973-265e-440e-a7e2-13e28f5fadd2-kube-api-access-gt66n\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950569 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b78255af-fb17-489c-94fb-b6f694bad656-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.950569 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:47.950431 2574 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 21:29:47.950569 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:47.950498 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-tls podName:b78255af-fb17-489c-94fb-b6f694bad656 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:48.450476959 +0000 UTC m=+191.223014053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-kj7jb" (UID: "b78255af-fb17-489c-94fb-b6f694bad656") : secret "openshift-state-metrics-tls" not found Apr 24 21:29:47.950569 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-sys\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:47.950975 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.950958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b78255af-fb17-489c-94fb-b6f694bad656-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.952616 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.952595 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:47.965700 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:47.965678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8kq\" (UniqueName: \"kubernetes.io/projected/b78255af-fb17-489c-94fb-b6f694bad656-kube-api-access-cb8kq\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:48.051394 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-sys\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051394 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051394 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-root\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051644 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-tls\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051644 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-sys\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051644 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-accelerators-collector-config\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051644 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-root\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051644 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-wtmp\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051644 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:48.051554 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:29:48.051644 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:29:48.051608 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-tls podName:18cd0973-265e-440e-a7e2-13e28f5fadd2 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:48.55159299 +0000 UTC m=+191.324130107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-tls") pod "node-exporter-nv7fx" (UID: "18cd0973-265e-440e-a7e2-13e28f5fadd2") : secret "node-exporter-tls" not found Apr 24 21:29:48.051944 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-textfile\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051944 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-wtmp\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051944 ip-10-0-133-209 kubenswrapper[2574]: I0424 
21:29:48.051673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18cd0973-265e-440e-a7e2-13e28f5fadd2-metrics-client-ca\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.051944 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.051691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt66n\" (UniqueName: \"kubernetes.io/projected/18cd0973-265e-440e-a7e2-13e28f5fadd2-kube-api-access-gt66n\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.052175 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.052050 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-accelerators-collector-config\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.052175 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.052090 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18cd0973-265e-440e-a7e2-13e28f5fadd2-metrics-client-ca\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.052175 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.052132 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-textfile\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " 
pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.054306 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.054278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.065886 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.065847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt66n\" (UniqueName: \"kubernetes.io/projected/18cd0973-265e-440e-a7e2-13e28f5fadd2-kube-api-access-gt66n\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.455453 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.455418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:48.457872 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.457850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b78255af-fb17-489c-94fb-b6f694bad656-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kj7jb\" (UID: \"b78255af-fb17-489c-94fb-b6f694bad656\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:48.556767 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.556725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-tls\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.558983 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.558962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/18cd0973-265e-440e-a7e2-13e28f5fadd2-node-exporter-tls\") pod \"node-exporter-nv7fx\" (UID: \"18cd0973-265e-440e-a7e2-13e28f5fadd2\") " pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.668201 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.668166 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" Apr 24 21:29:48.739543 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.739510 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nv7fx" Apr 24 21:29:48.747999 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:29:48.747971 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cd0973_265e_440e_a7e2_13e28f5fadd2.slice/crio-f10c7c115364c2c8dffaf39cbcc0d40fbdccc803ae597ed98fb8b0428f79d4c2 WatchSource:0}: Error finding container f10c7c115364c2c8dffaf39cbcc0d40fbdccc803ae597ed98fb8b0428f79d4c2: Status 404 returned error can't find the container with id f10c7c115364c2c8dffaf39cbcc0d40fbdccc803ae597ed98fb8b0428f79d4c2 Apr 24 21:29:48.797746 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:48.797713 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb"] Apr 24 21:29:48.800414 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:29:48.800382 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78255af_fb17_489c_94fb_b6f694bad656.slice/crio-b942e666694032913479727fdfa424b8132a63870916cf733ca8f8202b2e30bf WatchSource:0}: Error finding container b942e666694032913479727fdfa424b8132a63870916cf733ca8f8202b2e30bf: Status 404 returned error can't find the container with id b942e666694032913479727fdfa424b8132a63870916cf733ca8f8202b2e30bf Apr 24 21:29:49.376301 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:49.376260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" event={"ID":"b78255af-fb17-489c-94fb-b6f694bad656","Type":"ContainerStarted","Data":"c0b825ffa4854f1bce20faacaa3e06549810a0721d7273866a720872e4c5f8c2"} Apr 24 21:29:49.376492 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:49.376315 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" event={"ID":"b78255af-fb17-489c-94fb-b6f694bad656","Type":"ContainerStarted","Data":"592a8f39b20636274720aa6ccbafcc8aa847d90855620fa279d58cfef56c3f3c"} Apr 24 21:29:49.376492 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:49.376329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" event={"ID":"b78255af-fb17-489c-94fb-b6f694bad656","Type":"ContainerStarted","Data":"b942e666694032913479727fdfa424b8132a63870916cf733ca8f8202b2e30bf"} Apr 24 21:29:49.377504 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:49.377471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv7fx" event={"ID":"18cd0973-265e-440e-a7e2-13e28f5fadd2","Type":"ContainerStarted","Data":"f10c7c115364c2c8dffaf39cbcc0d40fbdccc803ae597ed98fb8b0428f79d4c2"} Apr 24 21:29:50.381453 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:50.381419 2574 generic.go:358] "Generic (PLEG): container finished" podID="18cd0973-265e-440e-a7e2-13e28f5fadd2" 
containerID="bbc8ff8a81619a7ded1c1737d3d79fbcb02a061f0a7097e90e658991670d2f0f" exitCode=0 Apr 24 21:29:50.381978 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:50.381508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv7fx" event={"ID":"18cd0973-265e-440e-a7e2-13e28f5fadd2","Type":"ContainerDied","Data":"bbc8ff8a81619a7ded1c1737d3d79fbcb02a061f0a7097e90e658991670d2f0f"} Apr 24 21:29:50.383412 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:50.383392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" event={"ID":"b78255af-fb17-489c-94fb-b6f694bad656","Type":"ContainerStarted","Data":"bbaa00a3cd3074822a2cb11c9ceb4848878b0b46e600aeda7440fc384160ca05"} Apr 24 21:29:50.423828 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:50.423785 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kj7jb" podStartSLOduration=2.5273743250000003 podStartE2EDuration="3.423771352s" podCreationTimestamp="2026-04-24 21:29:47 +0000 UTC" firstStartedPulling="2026-04-24 21:29:48.913608533 +0000 UTC m=+191.686145613" lastFinishedPulling="2026-04-24 21:29:49.810005545 +0000 UTC m=+192.582542640" observedRunningTime="2026-04-24 21:29:50.422778371 +0000 UTC m=+193.195315472" watchObservedRunningTime="2026-04-24 21:29:50.423771352 +0000 UTC m=+193.196308455" Apr 24 21:29:51.388218 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:51.388172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv7fx" event={"ID":"18cd0973-265e-440e-a7e2-13e28f5fadd2","Type":"ContainerStarted","Data":"4d5b4a76de5d9ec2fe569e5777b66780967d9e85649a331d13ed17b17f8aff6b"} Apr 24 21:29:51.388218 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:51.388217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv7fx" 
event={"ID":"18cd0973-265e-440e-a7e2-13e28f5fadd2","Type":"ContainerStarted","Data":"e1924a0ef0649985965c3dbb4895feb39ba265f93ce90481a8faf706dfcf7371"} Apr 24 21:29:51.409833 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:51.409789 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nv7fx" podStartSLOduration=3.6408937 podStartE2EDuration="4.409776382s" podCreationTimestamp="2026-04-24 21:29:47 +0000 UTC" firstStartedPulling="2026-04-24 21:29:48.749626509 +0000 UTC m=+191.522163590" lastFinishedPulling="2026-04-24 21:29:49.518509192 +0000 UTC m=+192.291046272" observedRunningTime="2026-04-24 21:29:51.408687359 +0000 UTC m=+194.181224487" watchObservedRunningTime="2026-04-24 21:29:51.409776382 +0000 UTC m=+194.182313485" Apr 24 21:29:52.169327 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.169291 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6ff56797f7-48p5j"] Apr 24 21:29:52.172282 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.172266 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.186653 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.186635 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:29:52.188012 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.187988 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7nuq32mdnstmt\"" Apr 24 21:29:52.188084 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.188007 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 21:29:52.188084 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.188007 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 21:29:52.188084 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.188009 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 21:29:52.188261 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.188172 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-xmc8k\"" Apr 24 21:29:52.206316 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.206291 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6ff56797f7-48p5j"] Apr 24 21:29:52.285014 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.284981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-client-ca-bundle\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " 
pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.285190 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.285045 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-secret-metrics-server-tls\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.285190 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.285122 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-secret-metrics-server-client-certs\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.285190 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.285154 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stcdx\" (UniqueName: \"kubernetes.io/projected/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-kube-api-access-stcdx\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.285190 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.285181 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.285378 ip-10-0-133-209 
kubenswrapper[2574]: I0424 21:29:52.285241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-metrics-server-audit-profiles\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.285378 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.285325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-audit-log\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.386458 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.386431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-client-ca-bundle\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.386570 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.386484 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-secret-metrics-server-tls\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.386634 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.386611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-secret-metrics-server-client-certs\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.386689 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.386659 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stcdx\" (UniqueName: \"kubernetes.io/projected/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-kube-api-access-stcdx\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.386742 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.386700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.386742 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.386733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-metrics-server-audit-profiles\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.386849 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.386785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-audit-log\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " 
pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.387180 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.387160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-audit-log\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.387482 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.387455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.387639 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.387621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-metrics-server-audit-profiles\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.388955 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.388934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-secret-metrics-server-tls\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.389250 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.389011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-secret-metrics-server-client-certs\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.389345 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.389324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-client-ca-bundle\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.395416 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.395395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stcdx\" (UniqueName: \"kubernetes.io/projected/7eb25fc0-0010-4dbe-a36e-3fc0eb43985c-kube-api-access-stcdx\") pod \"metrics-server-6ff56797f7-48p5j\" (UID: \"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c\") " pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.480232 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.480202 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:29:52.599037 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:52.598902 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6ff56797f7-48p5j"] Apr 24 21:29:52.601441 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:29:52.601414 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb25fc0_0010_4dbe_a36e_3fc0eb43985c.slice/crio-032b1ae42ad829f3c29b9842f5dd526c0f9face4d5e72d9cc3f785c073e416d7 WatchSource:0}: Error finding container 032b1ae42ad829f3c29b9842f5dd526c0f9face4d5e72d9cc3f785c073e416d7: Status 404 returned error can't find the container with id 032b1ae42ad829f3c29b9842f5dd526c0f9face4d5e72d9cc3f785c073e416d7 Apr 24 21:29:53.395626 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:53.395595 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" event={"ID":"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c","Type":"ContainerStarted","Data":"032b1ae42ad829f3c29b9842f5dd526c0f9face4d5e72d9cc3f785c073e416d7"} Apr 24 21:29:54.400379 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:54.400337 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" event={"ID":"7eb25fc0-0010-4dbe-a36e-3fc0eb43985c","Type":"ContainerStarted","Data":"4df2c002c981a4e139fc9e4cda0d8d1342e9fb7dfd6a608e7da2912ba391e986"} Apr 24 21:29:54.421397 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:29:54.421347 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" podStartSLOduration=1.231706655 podStartE2EDuration="2.421334183s" podCreationTimestamp="2026-04-24 21:29:52 +0000 UTC" firstStartedPulling="2026-04-24 21:29:52.603186208 +0000 UTC m=+195.375723290" lastFinishedPulling="2026-04-24 21:29:53.792813736 
+0000 UTC m=+196.565350818" observedRunningTime="2026-04-24 21:29:54.420517331 +0000 UTC m=+197.193054445" watchObservedRunningTime="2026-04-24 21:29:54.421334183 +0000 UTC m=+197.193871285" Apr 24 21:30:08.131608 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:08.131513 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" podUID="053382cc-bd8f-45e7-b716-226dc9c02245" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:30:12.480639 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:12.480596 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:30:12.481023 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:12.480653 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:30:18.132143 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:18.132086 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" podUID="053382cc-bd8f-45e7-b716-226dc9c02245" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:30:28.131735 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:28.131694 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" podUID="053382cc-bd8f-45e7-b716-226dc9c02245" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:30:28.132139 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:28.131766 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" Apr 24 21:30:28.132276 
ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:28.132246 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e0219301e1bbfea652d92541b7917322c14dde458b4a3b5e76a7e239718eafc7"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 21:30:28.132322 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:28.132308 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" podUID="053382cc-bd8f-45e7-b716-226dc9c02245" containerName="service-proxy" containerID="cri-o://e0219301e1bbfea652d92541b7917322c14dde458b4a3b5e76a7e239718eafc7" gracePeriod=30 Apr 24 21:30:28.484542 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:28.484507 2574 generic.go:358] "Generic (PLEG): container finished" podID="053382cc-bd8f-45e7-b716-226dc9c02245" containerID="e0219301e1bbfea652d92541b7917322c14dde458b4a3b5e76a7e239718eafc7" exitCode=2 Apr 24 21:30:28.484734 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:28.484549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" event={"ID":"053382cc-bd8f-45e7-b716-226dc9c02245","Type":"ContainerDied","Data":"e0219301e1bbfea652d92541b7917322c14dde458b4a3b5e76a7e239718eafc7"} Apr 24 21:30:28.484734 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:28.484578 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f584d4588-5rfcp" event={"ID":"053382cc-bd8f-45e7-b716-226dc9c02245","Type":"ContainerStarted","Data":"5b6a8d3b823dcad651b4e4ab4f6bd7f5433b6ba104d74227fb2b9dd8acee9305"} Apr 24 21:30:32.485898 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:32.485868 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:30:32.489878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:32.489857 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6ff56797f7-48p5j" Apr 24 21:30:38.079199 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:38.079168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6jctc_3fe6ee90-5f50-4532-8f34-d91e4dc1fccd/dns-node-resolver/0.log" Apr 24 21:30:49.526903 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:49.526853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:30:49.529185 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:49.529155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/52e9a5f8-832a-4f1e-add3-f10bf674757e-metrics-certs\") pod \"network-metrics-daemon-c4ck8\" (UID: \"52e9a5f8-832a-4f1e-add3-f10bf674757e\") " pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:30:49.684226 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:49.684194 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5kts7\"" Apr 24 21:30:49.692622 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:49.692162 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4ck8" Apr 24 21:30:49.839509 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:49.839483 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c4ck8"] Apr 24 21:30:49.841988 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:30:49.841962 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e9a5f8_832a_4f1e_add3_f10bf674757e.slice/crio-f0f2a51ed70d4b8837170965dae84c9aaefc8a2b9861874479a09737522deb52 WatchSource:0}: Error finding container f0f2a51ed70d4b8837170965dae84c9aaefc8a2b9861874479a09737522deb52: Status 404 returned error can't find the container with id f0f2a51ed70d4b8837170965dae84c9aaefc8a2b9861874479a09737522deb52 Apr 24 21:30:50.547160 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:50.547126 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c4ck8" event={"ID":"52e9a5f8-832a-4f1e-add3-f10bf674757e","Type":"ContainerStarted","Data":"f0f2a51ed70d4b8837170965dae84c9aaefc8a2b9861874479a09737522deb52"} Apr 24 21:30:51.551813 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:51.551774 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c4ck8" event={"ID":"52e9a5f8-832a-4f1e-add3-f10bf674757e","Type":"ContainerStarted","Data":"2867cb9fd8cc50f80578a1e2802d97094ba6c892b7c97cac10746fdaa107be80"} Apr 24 21:30:51.551813 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:30:51.551810 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c4ck8" event={"ID":"52e9a5f8-832a-4f1e-add3-f10bf674757e","Type":"ContainerStarted","Data":"67254a939f997e2c6316437afd0b5c61a3dc015e8f990373e2d6d429672f0ed3"} Apr 24 21:31:17.280177 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:31:17.280134 2574 pod_workers.go:1301] "Error syncing pod, skipping" 
err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2xr7h" podUID="59fef02e-c780-4d6e-a4b6-1ffe904c5a5a" Apr 24 21:31:17.624056 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:17.623970 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:31:20.572827 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.572794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:31:20.575202 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.575181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fef02e-c780-4d6e-a4b6-1ffe904c5a5a-cert\") pod \"ingress-canary-2xr7h\" (UID: \"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a\") " pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:31:20.628045 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.628014 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-42hwh\"" Apr 24 21:31:20.635450 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.635429 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2xr7h" Apr 24 21:31:20.673496 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.673465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:31:20.675695 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.675673 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/629df89e-192f-4942-ba14-4cb4f95cef70-metrics-tls\") pod \"dns-default-w7kpc\" (UID: \"629df89e-192f-4942-ba14-4cb4f95cef70\") " pod="openshift-dns/dns-default-w7kpc" Apr 24 21:31:20.682628 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.682570 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mzt4d\"" Apr 24 21:31:20.690324 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.690058 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w7kpc" Apr 24 21:31:20.762939 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.762892 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c4ck8" podStartSLOduration=282.792398425 podStartE2EDuration="4m43.76287735s" podCreationTimestamp="2026-04-24 21:26:37 +0000 UTC" firstStartedPulling="2026-04-24 21:30:49.84403273 +0000 UTC m=+252.616569811" lastFinishedPulling="2026-04-24 21:30:50.814511653 +0000 UTC m=+253.587048736" observedRunningTime="2026-04-24 21:30:51.576141594 +0000 UTC m=+254.348678698" watchObservedRunningTime="2026-04-24 21:31:20.76287735 +0000 UTC m=+283.535414454" Apr 24 21:31:20.763776 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.763755 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2xr7h"] Apr 24 21:31:20.768180 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:31:20.768093 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59fef02e_c780_4d6e_a4b6_1ffe904c5a5a.slice/crio-a98b9871c73e3c169866f18f8c68adadb729c80bd99d2d3f78e16803e1dec8af WatchSource:0}: Error finding container a98b9871c73e3c169866f18f8c68adadb729c80bd99d2d3f78e16803e1dec8af: Status 404 returned error can't find the container with id a98b9871c73e3c169866f18f8c68adadb729c80bd99d2d3f78e16803e1dec8af Apr 24 21:31:20.826816 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:20.826753 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w7kpc"] Apr 24 21:31:20.829960 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:31:20.829935 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod629df89e_192f_4942_ba14_4cb4f95cef70.slice/crio-a385f44e6c67783a568f8c5e917630b490f45eab91b9f2629a09bcb12125396f WatchSource:0}: Error finding 
container a385f44e6c67783a568f8c5e917630b490f45eab91b9f2629a09bcb12125396f: Status 404 returned error can't find the container with id a385f44e6c67783a568f8c5e917630b490f45eab91b9f2629a09bcb12125396f Apr 24 21:31:21.635636 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:21.635602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w7kpc" event={"ID":"629df89e-192f-4942-ba14-4cb4f95cef70","Type":"ContainerStarted","Data":"a385f44e6c67783a568f8c5e917630b490f45eab91b9f2629a09bcb12125396f"} Apr 24 21:31:21.637041 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:21.637008 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2xr7h" event={"ID":"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a","Type":"ContainerStarted","Data":"a98b9871c73e3c169866f18f8c68adadb729c80bd99d2d3f78e16803e1dec8af"} Apr 24 21:31:23.646783 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:23.646746 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w7kpc" event={"ID":"629df89e-192f-4942-ba14-4cb4f95cef70","Type":"ContainerStarted","Data":"1d470fc19b1abde88f627ff7c8ae35078da6f3040d363c4751f44b4232d2dd9d"} Apr 24 21:31:23.646783 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:23.646781 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w7kpc" event={"ID":"629df89e-192f-4942-ba14-4cb4f95cef70","Type":"ContainerStarted","Data":"d885cab2920c45bff7dc7fffa741d3e151f1430e89c253f773d2832fcb69955c"} Apr 24 21:31:23.647279 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:23.647002 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w7kpc" Apr 24 21:31:23.648048 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:23.648023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2xr7h" 
event={"ID":"59fef02e-c780-4d6e-a4b6-1ffe904c5a5a","Type":"ContainerStarted","Data":"cf8d141a1167968bc80ec8bb3c3f4986866c0706c6514474ab6e1d9c87eb2e0e"} Apr 24 21:31:23.696777 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:23.696410 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2xr7h" podStartSLOduration=251.884976546 podStartE2EDuration="4m13.696393363s" podCreationTimestamp="2026-04-24 21:27:10 +0000 UTC" firstStartedPulling="2026-04-24 21:31:20.770848073 +0000 UTC m=+283.543385155" lastFinishedPulling="2026-04-24 21:31:22.582264878 +0000 UTC m=+285.354801972" observedRunningTime="2026-04-24 21:31:23.69390408 +0000 UTC m=+286.466441183" watchObservedRunningTime="2026-04-24 21:31:23.696393363 +0000 UTC m=+286.468930469" Apr 24 21:31:23.697036 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:23.696948 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w7kpc" podStartSLOduration=251.950270814 podStartE2EDuration="4m13.696940795s" podCreationTimestamp="2026-04-24 21:27:10 +0000 UTC" firstStartedPulling="2026-04-24 21:31:20.831694137 +0000 UTC m=+283.604231218" lastFinishedPulling="2026-04-24 21:31:22.578364117 +0000 UTC m=+285.350901199" observedRunningTime="2026-04-24 21:31:23.674167267 +0000 UTC m=+286.446704370" watchObservedRunningTime="2026-04-24 21:31:23.696940795 +0000 UTC m=+286.469477897" Apr 24 21:31:33.652680 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:33.652647 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w7kpc" Apr 24 21:31:37.658334 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:37.658305 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:31:37.658817 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:37.658559 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:31:37.663293 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:31:37.663270 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:34:06.440263 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.440231 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf"] Apr 24 21:34:06.443149 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.443132 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.450931 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.450910 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:34:06.451172 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.451156 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-sm92j\"" Apr 24 21:34:06.451263 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.451175 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:34:06.457092 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.457075 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:34:06.459694 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.459676 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf"] Apr 24 21:34:06.494917 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.494893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/31d388b9-e88d-4758-9334-cb48df81bdb9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf\" (UID: \"31d388b9-e88d-4758-9334-cb48df81bdb9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.495063 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.495001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zthhv\" (UniqueName: \"kubernetes.io/projected/31d388b9-e88d-4758-9334-cb48df81bdb9-kube-api-access-zthhv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf\" (UID: \"31d388b9-e88d-4758-9334-cb48df81bdb9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.596413 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.596378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/31d388b9-e88d-4758-9334-cb48df81bdb9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf\" (UID: \"31d388b9-e88d-4758-9334-cb48df81bdb9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.596580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.596436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zthhv\" (UniqueName: \"kubernetes.io/projected/31d388b9-e88d-4758-9334-cb48df81bdb9-kube-api-access-zthhv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf\" (UID: \"31d388b9-e88d-4758-9334-cb48df81bdb9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.598721 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.598699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/31d388b9-e88d-4758-9334-cb48df81bdb9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf\" (UID: 
\"31d388b9-e88d-4758-9334-cb48df81bdb9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.614572 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.614546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zthhv\" (UniqueName: \"kubernetes.io/projected/31d388b9-e88d-4758-9334-cb48df81bdb9-kube-api-access-zthhv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf\" (UID: \"31d388b9-e88d-4758-9334-cb48df81bdb9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.753084 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.753060 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" Apr 24 21:34:06.902546 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.902516 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf"] Apr 24 21:34:06.906522 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:34:06.906493 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31d388b9_e88d_4758_9334_cb48df81bdb9.slice/crio-9ea0e4b2f04c4b64994f9b319b7b0a0d91260da3859f655efada1b868dd26317 WatchSource:0}: Error finding container 9ea0e4b2f04c4b64994f9b319b7b0a0d91260da3859f655efada1b868dd26317: Status 404 returned error can't find the container with id 9ea0e4b2f04c4b64994f9b319b7b0a0d91260da3859f655efada1b868dd26317 Apr 24 21:34:06.908916 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:06.908898 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:07.053651 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:07.053565 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" 
event={"ID":"31d388b9-e88d-4758-9334-cb48df81bdb9","Type":"ContainerStarted","Data":"9ea0e4b2f04c4b64994f9b319b7b0a0d91260da3859f655efada1b868dd26317"}
Apr 24 21:34:11.067815 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:11.067782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" event={"ID":"31d388b9-e88d-4758-9334-cb48df81bdb9","Type":"ContainerStarted","Data":"fe033a3c6765e2e734ea967f580a451ec10d4535d323a5cb757a73a13806d120"}
Apr 24 21:34:11.068212 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:11.067930 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf"
Apr 24 21:34:11.119607 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:11.119557 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf" podStartSLOduration=1.5307168180000001 podStartE2EDuration="5.119542179s" podCreationTimestamp="2026-04-24 21:34:06 +0000 UTC" firstStartedPulling="2026-04-24 21:34:06.909078975 +0000 UTC m=+449.681616071" lastFinishedPulling="2026-04-24 21:34:10.49790435 +0000 UTC m=+453.270441432" observedRunningTime="2026-04-24 21:34:11.119459422 +0000 UTC m=+453.891996526" watchObservedRunningTime="2026-04-24 21:34:11.119542179 +0000 UTC m=+453.892079282"
Apr 24 21:34:12.045004 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.044968 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"]
Apr 24 21:34:12.047955 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.047939 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.059606 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.059587 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 21:34:12.059710 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.059606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 21:34:12.060538 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.060524 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-nm9wf\""
Apr 24 21:34:12.068336 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.068310 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"]
Apr 24 21:34:12.143398 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.143366 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.143398 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.143404 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj7rv\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-kube-api-access-tj7rv\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.143637 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.143506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dbe40c03-0d25-4010-92e5-d025d1651a3e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.244245 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.244210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj7rv\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-kube-api-access-tj7rv\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.244418 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.244270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dbe40c03-0d25-4010-92e5-d025d1651a3e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.244418 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.244309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.244418 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.244396 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 24 21:34:12.244418 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.244409 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 21:34:12.244607 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.244424 2574 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 24 21:34:12.244607 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.244441 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 21:34:12.244607 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.244495 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates podName:dbe40c03-0d25-4010-92e5-d025d1651a3e nodeName:}" failed. No retries permitted until 2026-04-24 21:34:12.7444761 +0000 UTC m=+455.517013199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates") pod "keda-metrics-apiserver-7c9f485588-r8bmx" (UID: "dbe40c03-0d25-4010-92e5-d025d1651a3e") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 21:34:12.244712 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.244662 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dbe40c03-0d25-4010-92e5-d025d1651a3e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.271423 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.271394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj7rv\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-kube-api-access-tj7rv\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.317418 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.317348 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6n8hn"]
Apr 24 21:34:12.320616 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.320595 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.323568 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.323549 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 21:34:12.336428 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.336405 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6n8hn"]
Apr 24 21:34:12.446209 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.446174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/681b635c-ef3f-4fea-a840-5f091630a0e5-certificates\") pod \"keda-admission-cf49989db-6n8hn\" (UID: \"681b635c-ef3f-4fea-a840-5f091630a0e5\") " pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.446367 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.446228 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz4tr\" (UniqueName: \"kubernetes.io/projected/681b635c-ef3f-4fea-a840-5f091630a0e5-kube-api-access-cz4tr\") pod \"keda-admission-cf49989db-6n8hn\" (UID: \"681b635c-ef3f-4fea-a840-5f091630a0e5\") " pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.547048 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.547009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/681b635c-ef3f-4fea-a840-5f091630a0e5-certificates\") pod \"keda-admission-cf49989db-6n8hn\" (UID: \"681b635c-ef3f-4fea-a840-5f091630a0e5\") " pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.547218 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.547056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz4tr\" (UniqueName: \"kubernetes.io/projected/681b635c-ef3f-4fea-a840-5f091630a0e5-kube-api-access-cz4tr\") pod \"keda-admission-cf49989db-6n8hn\" (UID: \"681b635c-ef3f-4fea-a840-5f091630a0e5\") " pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.549416 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.549397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/681b635c-ef3f-4fea-a840-5f091630a0e5-certificates\") pod \"keda-admission-cf49989db-6n8hn\" (UID: \"681b635c-ef3f-4fea-a840-5f091630a0e5\") " pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.562964 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.562939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz4tr\" (UniqueName: \"kubernetes.io/projected/681b635c-ef3f-4fea-a840-5f091630a0e5-kube-api-access-cz4tr\") pod \"keda-admission-cf49989db-6n8hn\" (UID: \"681b635c-ef3f-4fea-a840-5f091630a0e5\") " pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.630364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.630289 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:12.748580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.748551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:12.748734 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.748692 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 24 21:34:12.748734 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.748710 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 21:34:12.748734 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.748726 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx: references non-existent secret key: tls.crt
Apr 24 21:34:12.748834 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:12.748791 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates podName:dbe40c03-0d25-4010-92e5-d025d1651a3e nodeName:}" failed. No retries permitted until 2026-04-24 21:34:13.748776486 +0000 UTC m=+456.521313568 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates") pod "keda-metrics-apiserver-7c9f485588-r8bmx" (UID: "dbe40c03-0d25-4010-92e5-d025d1651a3e") : references non-existent secret key: tls.crt
Apr 24 21:34:12.769416 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:12.769395 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6n8hn"]
Apr 24 21:34:12.771602 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:34:12.771577 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod681b635c_ef3f_4fea_a840_5f091630a0e5.slice/crio-7be7525bf98bd014167e4a4380cfa574023fb74663bcba8fca942776e0674d6a WatchSource:0}: Error finding container 7be7525bf98bd014167e4a4380cfa574023fb74663bcba8fca942776e0674d6a: Status 404 returned error can't find the container with id 7be7525bf98bd014167e4a4380cfa574023fb74663bcba8fca942776e0674d6a
Apr 24 21:34:13.074376 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:13.074338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6n8hn" event={"ID":"681b635c-ef3f-4fea-a840-5f091630a0e5","Type":"ContainerStarted","Data":"7be7525bf98bd014167e4a4380cfa574023fb74663bcba8fca942776e0674d6a"}
Apr 24 21:34:13.756188 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:13.756156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:13.756352 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:13.756316 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 24 21:34:13.756352 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:13.756333 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 21:34:13.756352 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:13.756353 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx: references non-existent secret key: tls.crt
Apr 24 21:34:13.756498 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:34:13.756405 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates podName:dbe40c03-0d25-4010-92e5-d025d1651a3e nodeName:}" failed. No retries permitted until 2026-04-24 21:34:15.756389857 +0000 UTC m=+458.528926950 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates") pod "keda-metrics-apiserver-7c9f485588-r8bmx" (UID: "dbe40c03-0d25-4010-92e5-d025d1651a3e") : references non-existent secret key: tls.crt
Apr 24 21:34:15.081989 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:15.081954 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6n8hn" event={"ID":"681b635c-ef3f-4fea-a840-5f091630a0e5","Type":"ContainerStarted","Data":"a5f037a9dca981ad6d9d339a78a07192a50ccc6011210081a7f7eff60476909d"}
Apr 24 21:34:15.082440 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:15.082069 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:34:15.102217 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:15.102173 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6n8hn" podStartSLOduration=1.8023713730000002 podStartE2EDuration="3.102155093s" podCreationTimestamp="2026-04-24 21:34:12 +0000 UTC" firstStartedPulling="2026-04-24 21:34:12.772828735 +0000 UTC m=+455.545365816" lastFinishedPulling="2026-04-24 21:34:14.07261244 +0000 UTC m=+456.845149536" observedRunningTime="2026-04-24 21:34:15.10060652 +0000 UTC m=+457.873143626" watchObservedRunningTime="2026-04-24 21:34:15.102155093 +0000 UTC m=+457.874692195"
Apr 24 21:34:15.771713 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:15.771679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:15.774004 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:15.773985 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dbe40c03-0d25-4010-92e5-d025d1651a3e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-r8bmx\" (UID: \"dbe40c03-0d25-4010-92e5-d025d1651a3e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:15.957742 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:15.957704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:16.070188 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:16.070156 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"]
Apr 24 21:34:16.072872 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:34:16.072827 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe40c03_0d25_4010_92e5_d025d1651a3e.slice/crio-77e2a17d6928a5dfe1e272d5e783ea140d5992f594b35e7d2482303f2bd63941 WatchSource:0}: Error finding container 77e2a17d6928a5dfe1e272d5e783ea140d5992f594b35e7d2482303f2bd63941: Status 404 returned error can't find the container with id 77e2a17d6928a5dfe1e272d5e783ea140d5992f594b35e7d2482303f2bd63941
Apr 24 21:34:16.085718 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:16.085689 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx" event={"ID":"dbe40c03-0d25-4010-92e5-d025d1651a3e","Type":"ContainerStarted","Data":"77e2a17d6928a5dfe1e272d5e783ea140d5992f594b35e7d2482303f2bd63941"}
Apr 24 21:34:19.095669 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:19.095631 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx" event={"ID":"dbe40c03-0d25-4010-92e5-d025d1651a3e","Type":"ContainerStarted","Data":"e9bce676216127b9e7f67f792626ac482d760d100791cdef480de4a2ddafc001"}
Apr 24 21:34:19.096042 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:19.095840 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:19.117197 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:19.117154 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx" podStartSLOduration=5.6641329129999995 podStartE2EDuration="8.117141203s" podCreationTimestamp="2026-04-24 21:34:11 +0000 UTC" firstStartedPulling="2026-04-24 21:34:16.074209606 +0000 UTC m=+458.846746687" lastFinishedPulling="2026-04-24 21:34:18.527217878 +0000 UTC m=+461.299754977" observedRunningTime="2026-04-24 21:34:19.115776914 +0000 UTC m=+461.888314019" watchObservedRunningTime="2026-04-24 21:34:19.117141203 +0000 UTC m=+461.889678309"
Apr 24 21:34:30.103520 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:30.103489 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-r8bmx"
Apr 24 21:34:32.073301 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:32.073271 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-4v8bf"
Apr 24 21:34:36.087845 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:34:36.087816 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6n8hn"
Apr 24 21:35:16.859424 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.859354 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"]
Apr 24 21:35:16.862482 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.862463 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:16.868791 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.868771 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 21:35:16.868899 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.868794 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:35:16.868899 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.868807 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-5mppp\""
Apr 24 21:35:16.869764 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.869747 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:35:16.878448 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.878429 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"]
Apr 24 21:35:16.899890 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.899865 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44p6\" (UniqueName: \"kubernetes.io/projected/6b5780f0-47fe-431a-bc89-537580f83a52-kube-api-access-w44p6\") pod \"llmisvc-controller-manager-68cc5db7c4-fksh8\" (UID: \"6b5780f0-47fe-431a-bc89-537580f83a52\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:16.899986 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:16.899908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b5780f0-47fe-431a-bc89-537580f83a52-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fksh8\" (UID: \"6b5780f0-47fe-431a-bc89-537580f83a52\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:17.000955 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:17.000927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b5780f0-47fe-431a-bc89-537580f83a52-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fksh8\" (UID: \"6b5780f0-47fe-431a-bc89-537580f83a52\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:17.001115 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:17.001000 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w44p6\" (UniqueName: \"kubernetes.io/projected/6b5780f0-47fe-431a-bc89-537580f83a52-kube-api-access-w44p6\") pod \"llmisvc-controller-manager-68cc5db7c4-fksh8\" (UID: \"6b5780f0-47fe-431a-bc89-537580f83a52\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:17.003315 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:17.003293 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b5780f0-47fe-431a-bc89-537580f83a52-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fksh8\" (UID: \"6b5780f0-47fe-431a-bc89-537580f83a52\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:17.025073 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:17.025054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44p6\" (UniqueName: \"kubernetes.io/projected/6b5780f0-47fe-431a-bc89-537580f83a52-kube-api-access-w44p6\") pod \"llmisvc-controller-manager-68cc5db7c4-fksh8\" (UID: \"6b5780f0-47fe-431a-bc89-537580f83a52\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:17.172013 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:17.171941 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:17.299553 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:17.299525 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"]
Apr 24 21:35:18.256980 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:18.256948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8" event={"ID":"6b5780f0-47fe-431a-bc89-537580f83a52","Type":"ContainerStarted","Data":"dcf28159912e72d8a628e5f2bda785c9e312f9e360cd942fc764f13e091b3baf"}
Apr 24 21:35:19.260988 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:19.260952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8" event={"ID":"6b5780f0-47fe-431a-bc89-537580f83a52","Type":"ContainerStarted","Data":"ca74360b3cdb7e56b715a6c747f5cbc8002cfe69a8264c784ae32467285554d4"}
Apr 24 21:35:19.261314 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:19.261070 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:35:19.279673 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:19.279629 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8" podStartSLOduration=1.447733655 podStartE2EDuration="3.279617562s" podCreationTimestamp="2026-04-24 21:35:16 +0000 UTC" firstStartedPulling="2026-04-24 21:35:17.305296336 +0000 UTC m=+520.077833430" lastFinishedPulling="2026-04-24 21:35:19.137180251 +0000 UTC m=+521.909717337" observedRunningTime="2026-04-24 21:35:19.277454546 +0000 UTC m=+522.049991650" watchObservedRunningTime="2026-04-24 21:35:19.279617562 +0000 UTC m=+522.052154690"
Apr 24 21:35:50.265969 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:35:50.265891 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fksh8"
Apr 24 21:36:37.675873 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:36:37.675841 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 21:36:37.677129 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:36:37.677070 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 21:37:00.919242 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.919206 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"]
Apr 24 21:37:00.922512 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.922488 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:00.926136 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.926083 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\""
Apr 24 21:37:00.926136 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.926130 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:37:00.926344 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.926117 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\""
Apr 24 21:37:00.926344 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.926204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6ztvm\""
Apr 24 21:37:00.926344 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.926219 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:37:00.933231 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.933210 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"]
Apr 24 21:37:00.943447 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.943426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:00.943572 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.943461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:00.943572 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.943506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57pd4\" (UniqueName: \"kubernetes.io/projected/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kube-api-access-57pd4\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:00.943663 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:00.943590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.044037 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.044008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.044203 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.044043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.044274 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.044248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57pd4\" (UniqueName: \"kubernetes.io/projected/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kube-api-access-57pd4\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.044335 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.044315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.044670 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.044647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.044754 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.044698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.046537 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.046517 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.052545 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.052526 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57pd4\" (UniqueName: \"kubernetes.io/projected/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kube-api-access-57pd4\") pod \"isvc-sklearn-graph-1-predictor-79868c757f-zc9fh\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.233671 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.233643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"
Apr 24 21:37:01.363226 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.363190 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"]
Apr 24 21:37:01.366073 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:37:01.366044 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4bdf6b1_e39b_4813_8d7c_e4b8a0bdafb3.slice/crio-240605e5be8fd4ec24fdb82848120f909b7aede6d422f59948614a9e489e0f4a WatchSource:0}: Error finding container 240605e5be8fd4ec24fdb82848120f909b7aede6d422f59948614a9e489e0f4a: Status 404 returned error can't find the container with id 240605e5be8fd4ec24fdb82848120f909b7aede6d422f59948614a9e489e0f4a
Apr 24 21:37:01.535410 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:01.535335 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerStarted","Data":"240605e5be8fd4ec24fdb82848120f909b7aede6d422f59948614a9e489e0f4a"}
Apr 24 21:37:02.038602 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.038215 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"]
Apr 24 21:37:02.045457 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.044001 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"
Apr 24 21:37:02.046894 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.046869 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\""
Apr 24 21:37:02.048137 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.047851 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\""
Apr 24 21:37:02.053001 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.052961 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"]
Apr 24 21:37:02.152285 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.152226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mp9m\" (UniqueName: \"kubernetes.io/projected/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kube-api-access-7mp9m\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"
Apr 24 21:37:02.152285 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.152281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"
Apr 24 21:37:02.152580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.152351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"
Apr 24 21:37:02.152580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.152434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"
Apr 24 21:37:02.254195 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.253695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mp9m\" (UniqueName: \"kubernetes.io/projected/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kube-api-access-7mp9m\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"
Apr 24 21:37:02.254195 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.253752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"
Apr 24 21:37:02.254195 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.253824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:02.254195 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.253877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:02.254571 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.254458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:02.254912 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.254887 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:02.257504 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.257475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:02.263916 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.263852 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mp9m\" (UniqueName: \"kubernetes.io/projected/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kube-api-access-7mp9m\") pod \"isvc-sklearn-graph-2-predictor-7758df598f-nldq8\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:02.363201 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.363126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:02.514527 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:02.514460 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"] Apr 24 21:37:04.521061 ip-10-0-133-209 kubenswrapper[2574]: W0424 21:37:04.521028 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e80edf_1330_461d_8e2e_52e2cf2c9b92.slice/crio-04b0fa6172830dc72710725210e78dc31dd7331fb3b79ccca76fd7058b2b716f WatchSource:0}: Error finding container 04b0fa6172830dc72710725210e78dc31dd7331fb3b79ccca76fd7058b2b716f: Status 404 returned error can't find the container with id 04b0fa6172830dc72710725210e78dc31dd7331fb3b79ccca76fd7058b2b716f Apr 24 21:37:04.546174 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:04.546137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" 
event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerStarted","Data":"04b0fa6172830dc72710725210e78dc31dd7331fb3b79ccca76fd7058b2b716f"} Apr 24 21:37:05.550712 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:05.550666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerStarted","Data":"2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e"} Apr 24 21:37:05.551942 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:05.551918 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerStarted","Data":"68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020"} Apr 24 21:37:08.561679 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:08.561648 2574 generic.go:358] "Generic (PLEG): container finished" podID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerID="2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e" exitCode=0 Apr 24 21:37:08.562021 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:08.561729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerDied","Data":"2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e"} Apr 24 21:37:09.567949 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:09.567908 2574 generic.go:358] "Generic (PLEG): container finished" podID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerID="68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020" exitCode=0 Apr 24 21:37:09.568410 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:09.568010 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" 
event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerDied","Data":"68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020"} Apr 24 21:37:24.634765 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:24.634721 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerStarted","Data":"34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5"} Apr 24 21:37:24.637850 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:24.637820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerStarted","Data":"dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517"} Apr 24 21:37:26.645298 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:26.645262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerStarted","Data":"2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6"} Apr 24 21:37:26.645756 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:26.645383 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" Apr 24 21:37:26.647110 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:26.647071 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerStarted","Data":"5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d"} Apr 24 21:37:26.647215 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:26.647201 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:26.667311 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:26.667256 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podStartSLOduration=1.607383806 podStartE2EDuration="26.667242064s" podCreationTimestamp="2026-04-24 21:37:00 +0000 UTC" firstStartedPulling="2026-04-24 21:37:01.367781682 +0000 UTC m=+624.140318763" lastFinishedPulling="2026-04-24 21:37:26.427639939 +0000 UTC m=+649.200177021" observedRunningTime="2026-04-24 21:37:26.66583985 +0000 UTC m=+649.438376957" watchObservedRunningTime="2026-04-24 21:37:26.667242064 +0000 UTC m=+649.439779168" Apr 24 21:37:26.687174 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:26.687064 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podStartSLOduration=2.788888569 podStartE2EDuration="24.687048632s" podCreationTimestamp="2026-04-24 21:37:02 +0000 UTC" firstStartedPulling="2026-04-24 21:37:04.523074848 +0000 UTC m=+627.295611929" lastFinishedPulling="2026-04-24 21:37:26.421234896 +0000 UTC m=+649.193771992" observedRunningTime="2026-04-24 21:37:26.684215964 +0000 UTC m=+649.456753071" watchObservedRunningTime="2026-04-24 21:37:26.687048632 +0000 UTC m=+649.459585736" Apr 24 21:37:27.650524 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:27.650495 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" Apr 24 21:37:27.650524 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:27.650526 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:27.651456 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:27.651427 2574 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 24 21:37:27.651590 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:27.651470 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:37:28.653007 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:28.652963 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 24 21:37:28.653432 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:28.652966 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:37:33.657764 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:33.657732 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:37:33.658292 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:33.658265 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" 
Apr 24 21:37:33.658393 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:33.658378 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" Apr 24 21:37:33.658771 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:33.658745 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:37:43.658349 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:43.658299 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 24 21:37:43.659004 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:43.658790 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:37:53.659114 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:53.659059 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 24 21:37:53.659480 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:37:53.659059 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:38:03.658795 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:03.658754 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:38:03.659176 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:03.658754 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 24 21:38:13.658522 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:13.658481 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 24 21:38:13.658955 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:13.658784 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:38:23.658718 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:23.658636 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection 
refused" Apr 24 21:38:23.659172 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:23.658795 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:38:33.659238 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:33.659210 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:38:33.659610 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:38:33.659391 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" Apr 24 21:39:11.085061 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.085027 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"] Apr 24 21:39:11.085680 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.085478 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" containerID="cri-o://dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517" gracePeriod=30 Apr 24 21:39:11.085680 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.085512 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kube-rbac-proxy" containerID="cri-o://5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d" gracePeriod=30 Apr 24 21:39:11.133828 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.133788 2574 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"] Apr 24 21:39:11.134197 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.134158 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" containerID="cri-o://34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5" gracePeriod=30 Apr 24 21:39:11.134336 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.134242 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kube-rbac-proxy" containerID="cri-o://2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6" gracePeriod=30 Apr 24 21:39:11.951094 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.951058 2574 generic.go:358] "Generic (PLEG): container finished" podID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerID="2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6" exitCode=2 Apr 24 21:39:11.951295 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.951139 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerDied","Data":"2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6"} Apr 24 21:39:11.952755 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.952733 2574 generic.go:358] "Generic (PLEG): container finished" podID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerID="5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d" exitCode=2 Apr 24 21:39:11.952885 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:11.952807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerDied","Data":"5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d"} Apr 24 21:39:13.653042 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:13.653002 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.18:8643/healthz\": dial tcp 10.133.0.18:8643: connect: connection refused" Apr 24 21:39:13.653423 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:13.653008 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.17:8643/healthz\": dial tcp 10.133.0.17:8643: connect: connection refused" Apr 24 21:39:13.658473 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:13.658453 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 24 21:39:13.659661 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:13.659643 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 24 21:39:15.434878 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.434854 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:39:15.480325 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.480302 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" Apr 24 21:39:15.502616 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502591 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " Apr 24 21:39:15.502732 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502654 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-proxy-tls\") pod \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " Apr 24 21:39:15.502732 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502685 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mp9m\" (UniqueName: \"kubernetes.io/projected/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kube-api-access-7mp9m\") pod \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " Apr 24 21:39:15.502732 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502715 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kserve-provision-location\") pod \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " Apr 24 21:39:15.502898 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502757 
2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-proxy-tls\") pod \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\" (UID: \"f8e80edf-1330-461d-8e2e-52e2cf2c9b92\") " Apr 24 21:39:15.502898 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502789 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57pd4\" (UniqueName: \"kubernetes.io/projected/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kube-api-access-57pd4\") pod \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " Apr 24 21:39:15.502898 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502826 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kserve-provision-location\") pod \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " Apr 24 21:39:15.502898 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.502878 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\" (UID: \"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3\") " Apr 24 21:39:15.503077 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.503025 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "f8e80edf-1330-461d-8e2e-52e2cf2c9b92" (UID: "f8e80edf-1330-461d-8e2e-52e2cf2c9b92"). 
InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:15.503077 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.503046 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8e80edf-1330-461d-8e2e-52e2cf2c9b92" (UID: "f8e80edf-1330-461d-8e2e-52e2cf2c9b92"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:39:15.503209 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.503161 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.503283 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.503262 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kserve-provision-location\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.503364 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.503342 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" (UID: "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:39:15.503420 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.503381 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" (UID: "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:15.505031 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.505003 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f8e80edf-1330-461d-8e2e-52e2cf2c9b92" (UID: "f8e80edf-1330-461d-8e2e-52e2cf2c9b92"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:15.505031 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.505013 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" (UID: "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:15.505301 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.505050 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kube-api-access-57pd4" (OuterVolumeSpecName: "kube-api-access-57pd4") pod "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" (UID: "f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3"). InnerVolumeSpecName "kube-api-access-57pd4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:15.505301 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.505142 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kube-api-access-7mp9m" (OuterVolumeSpecName: "kube-api-access-7mp9m") pod "f8e80edf-1330-461d-8e2e-52e2cf2c9b92" (UID: "f8e80edf-1330-461d-8e2e-52e2cf2c9b92"). InnerVolumeSpecName "kube-api-access-7mp9m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:15.604592 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.604523 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.604592 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.604548 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-proxy-tls\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.604592 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.604558 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mp9m\" (UniqueName: \"kubernetes.io/projected/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-kube-api-access-7mp9m\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.604592 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.604567 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8e80edf-1330-461d-8e2e-52e2cf2c9b92-proxy-tls\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.604592 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.604577 2574 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-57pd4\" (UniqueName: \"kubernetes.io/projected/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kube-api-access-57pd4\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.604592 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.604585 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3-kserve-provision-location\") on node \"ip-10-0-133-209.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.965226 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.965197 2574 generic.go:358] "Generic (PLEG): container finished" podID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerID="34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5" exitCode=0 Apr 24 21:39:15.965404 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.965265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerDied","Data":"34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5"} Apr 24 21:39:15.965404 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.965278 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" Apr 24 21:39:15.965404 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.965298 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh" event={"ID":"f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3","Type":"ContainerDied","Data":"240605e5be8fd4ec24fdb82848120f909b7aede6d422f59948614a9e489e0f4a"} Apr 24 21:39:15.965404 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.965319 2574 scope.go:117] "RemoveContainer" containerID="2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6" Apr 24 21:39:15.967041 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.967016 2574 generic.go:358] "Generic (PLEG): container finished" podID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerID="dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517" exitCode=0 Apr 24 21:39:15.967189 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.967171 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" Apr 24 21:39:15.967284 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.967149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerDied","Data":"dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517"} Apr 24 21:39:15.967348 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.967299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8" event={"ID":"f8e80edf-1330-461d-8e2e-52e2cf2c9b92","Type":"ContainerDied","Data":"04b0fa6172830dc72710725210e78dc31dd7331fb3b79ccca76fd7058b2b716f"} Apr 24 21:39:15.973460 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.973442 2574 scope.go:117] "RemoveContainer" containerID="34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5" Apr 24 21:39:15.979971 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.979949 2574 scope.go:117] "RemoveContainer" containerID="2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e" Apr 24 21:39:15.985416 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.985396 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"] Apr 24 21:39:15.987314 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.987294 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-79868c757f-zc9fh"] Apr 24 21:39:15.990139 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.990123 2574 scope.go:117] "RemoveContainer" containerID="2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6" Apr 24 21:39:15.990417 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:39:15.990396 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6\": container with ID starting with 2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6 not found: ID does not exist" containerID="2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6" Apr 24 21:39:15.990475 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.990425 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6"} err="failed to get container status \"2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6\": rpc error: code = NotFound desc = could not find container \"2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6\": container with ID starting with 2ff66c2f251f03fe06e90e7baebf2e974c84ce4832748ed4f5da9d5a9f8262f6 not found: ID does not exist" Apr 24 21:39:15.990475 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.990443 2574 scope.go:117] "RemoveContainer" containerID="34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5" Apr 24 21:39:15.990675 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:39:15.990656 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5\": container with ID starting with 34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5 not found: ID does not exist" containerID="34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5" Apr 24 21:39:15.990715 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.990682 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5"} err="failed to get container status \"34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5\": rpc error: code = NotFound desc 
= could not find container \"34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5\": container with ID starting with 34216ddd0342dc4d64c21ab7629f7f6ccdacea0dabb95383749f5186949e8cd5 not found: ID does not exist" Apr 24 21:39:15.990715 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.990698 2574 scope.go:117] "RemoveContainer" containerID="2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e" Apr 24 21:39:15.990927 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:39:15.990912 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e\": container with ID starting with 2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e not found: ID does not exist" containerID="2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e" Apr 24 21:39:15.990979 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.990930 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e"} err="failed to get container status \"2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e\": rpc error: code = NotFound desc = could not find container \"2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e\": container with ID starting with 2150d7e2a353b1c9cc75da136184ef6aaab12116f41acde87476203f9955479e not found: ID does not exist" Apr 24 21:39:15.990979 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.990944 2574 scope.go:117] "RemoveContainer" containerID="5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d" Apr 24 21:39:15.997046 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:15.997027 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"] Apr 24 21:39:15.997602 ip-10-0-133-209 kubenswrapper[2574]: I0424 
21:39:15.997590 2574 scope.go:117] "RemoveContainer" containerID="dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517" Apr 24 21:39:16.000809 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.000790 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7758df598f-nldq8"] Apr 24 21:39:16.004149 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.004131 2574 scope.go:117] "RemoveContainer" containerID="68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020" Apr 24 21:39:16.010395 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.010377 2574 scope.go:117] "RemoveContainer" containerID="5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d" Apr 24 21:39:16.010636 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:39:16.010618 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d\": container with ID starting with 5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d not found: ID does not exist" containerID="5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d" Apr 24 21:39:16.010698 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.010646 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d"} err="failed to get container status \"5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d\": rpc error: code = NotFound desc = could not find container \"5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d\": container with ID starting with 5e6ad5df1eba2a1d48166f1a5cc9fbd05e1e110c9bc2fc1800fa1057c1d72b3d not found: ID does not exist" Apr 24 21:39:16.010698 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.010670 2574 scope.go:117] "RemoveContainer" 
containerID="dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517" Apr 24 21:39:16.010904 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:39:16.010888 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517\": container with ID starting with dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517 not found: ID does not exist" containerID="dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517" Apr 24 21:39:16.010967 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.010910 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517"} err="failed to get container status \"dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517\": rpc error: code = NotFound desc = could not find container \"dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517\": container with ID starting with dee5d5ce5ba8756eea053a456da4708f28e62e94cbcdbd17de4124fccb301517 not found: ID does not exist" Apr 24 21:39:16.010967 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.010932 2574 scope.go:117] "RemoveContainer" containerID="68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020" Apr 24 21:39:16.011154 ip-10-0-133-209 kubenswrapper[2574]: E0424 21:39:16.011136 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020\": container with ID starting with 68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020 not found: ID does not exist" containerID="68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020" Apr 24 21:39:16.011208 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:16.011158 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020"} err="failed to get container status \"68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020\": rpc error: code = NotFound desc = could not find container \"68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020\": container with ID starting with 68435c6e92449cc46869795d0d95258d4f790fe37bc1e3497bc64f8fbc80d020 not found: ID does not exist" Apr 24 21:39:17.787825 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:17.787787 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" path="/var/lib/kubelet/pods/f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3/volumes" Apr 24 21:39:17.788290 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:39:17.788275 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" path="/var/lib/kubelet/pods/f8e80edf-1330-461d-8e2e-52e2cf2c9b92/volumes" Apr 24 21:41:37.694165 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:41:37.694016 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:41:37.696580 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:41:37.696559 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:46:37.716803 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:46:37.716774 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:46:37.719082 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:46:37.719063 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:51:37.736754 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:51:37.736724 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:51:37.740237 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:51:37.740213 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:56:37.757748 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:56:37.757720 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 21:56:37.760963 ip-10-0-133-209 kubenswrapper[2574]: I0424 21:56:37.760942 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:01:37.775360 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:01:37.775330 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:01:37.779742 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:01:37.779721 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:06:37.792721 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:06:37.792694 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:06:37.797465 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:06:37.797445 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:11:37.812906 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:11:37.812795 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:11:37.816789 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:11:37.816431 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:16:37.830861 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:37.830756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:16:37.839432 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:37.834266 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log" Apr 24 22:16:38.591354 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:38.591325 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h6cws_500cc7d2-1561-40b2-956f-6e2b94ec6ebc/global-pull-secret-syncer/0.log" Apr 24 22:16:38.810062 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:38.810029 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tf6b4_01d20a9b-0255-4507-a8b5-862da4147c01/konnectivity-agent/0.log" Apr 24 22:16:38.836700 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:38.836669 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-209.ec2.internal_fa9ef8c20bc608bfc7fe350ef3c4a29b/haproxy/0.log" Apr 24 22:16:42.200125 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:42.200077 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6ff56797f7-48p5j_7eb25fc0-0010-4dbe-a36e-3fc0eb43985c/metrics-server/0.log" Apr 24 22:16:42.401474 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:42.401432 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nv7fx_18cd0973-265e-440e-a7e2-13e28f5fadd2/node-exporter/0.log" Apr 24 22:16:42.421733 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:42.421708 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nv7fx_18cd0973-265e-440e-a7e2-13e28f5fadd2/kube-rbac-proxy/0.log" Apr 24 22:16:42.444778 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:42.444759 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nv7fx_18cd0973-265e-440e-a7e2-13e28f5fadd2/init-textfile/0.log" Apr 24 22:16:42.483360 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:42.483343 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kj7jb_b78255af-fb17-489c-94fb-b6f694bad656/kube-rbac-proxy-main/0.log" Apr 24 22:16:42.507726 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:42.507709 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kj7jb_b78255af-fb17-489c-94fb-b6f694bad656/kube-rbac-proxy-self/0.log" Apr 24 22:16:42.530066 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:42.530044 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kj7jb_b78255af-fb17-489c-94fb-b6f694bad656/openshift-state-metrics/0.log" Apr 24 22:16:45.764143 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764088 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"] Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764472 2574 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764491 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764505 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="storage-initializer" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764514 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="storage-initializer" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764527 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="storage-initializer" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764534 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="storage-initializer" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764554 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kube-rbac-proxy" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764562 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kube-rbac-proxy" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764572 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" Apr 24 22:16:45.764621 
ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764580 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764589 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kube-rbac-proxy" Apr 24 22:16:45.764621 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764597 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kube-rbac-proxy" Apr 24 22:16:45.765188 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764668 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kube-rbac-proxy" Apr 24 22:16:45.765188 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764682 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4bdf6b1-e39b-4813-8d7c-e4b8a0bdafb3" containerName="kserve-container" Apr 24 22:16:45.765188 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764695 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kube-rbac-proxy" Apr 24 22:16:45.765188 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.764705 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e80edf-1330-461d-8e2e-52e2cf2c9b92" containerName="kserve-container" Apr 24 22:16:45.766509 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.766488 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" Apr 24 22:16:45.769205 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.769185 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7csvq\"/\"default-dockercfg-vv79p\"" Apr 24 22:16:45.769312 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.769186 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7csvq\"/\"openshift-service-ca.crt\"" Apr 24 22:16:45.770213 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.770193 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7csvq\"/\"kube-root-ca.crt\"" Apr 24 22:16:45.776360 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.776340 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"] Apr 24 22:16:45.857706 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.857681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-sys\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" Apr 24 22:16:45.857706 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.857710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-proc\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" Apr 24 22:16:45.857906 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.857736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cmkvj\" (UniqueName: \"kubernetes.io/projected/bea18c95-773c-4a37-bdef-f17a477cf773-kube-api-access-cmkvj\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" Apr 24 22:16:45.857906 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.857789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-lib-modules\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" Apr 24 22:16:45.857906 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.857871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-podres\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" Apr 24 22:16:45.958586 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-podres\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" Apr 24 22:16:45.958695 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-sys\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " 
pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.958695 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-proc\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.958695 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkvj\" (UniqueName: \"kubernetes.io/projected/bea18c95-773c-4a37-bdef-f17a477cf773-kube-api-access-cmkvj\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.958839 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-lib-modules\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.958839 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-sys\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.958839 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-podres\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.958839 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-proc\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.958839 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.958820 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bea18c95-773c-4a37-bdef-f17a477cf773-lib-modules\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:45.966614 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:45.966592 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmkvj\" (UniqueName: \"kubernetes.io/projected/bea18c95-773c-4a37-bdef-f17a477cf773-kube-api-access-cmkvj\") pod \"perf-node-gather-daemonset-7nrzq\" (UID: \"bea18c95-773c-4a37-bdef-f17a477cf773\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:46.076773 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:46.076717 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:46.194895 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:46.194864 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"]
Apr 24 22:16:46.197772 ip-10-0-133-209 kubenswrapper[2574]: W0424 22:16:46.197742 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbea18c95_773c_4a37_bdef_f17a477cf773.slice/crio-a5083e7c1d5a4782dd0dc3673965194a90c3e0841d9c92ad1a3046ada2e3b4fb WatchSource:0}: Error finding container a5083e7c1d5a4782dd0dc3673965194a90c3e0841d9c92ad1a3046ada2e3b4fb: Status 404 returned error can't find the container with id a5083e7c1d5a4782dd0dc3673965194a90c3e0841d9c92ad1a3046ada2e3b4fb
Apr 24 22:16:46.199408 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:46.199387 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:16:46.288849 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:46.288824 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w7kpc_629df89e-192f-4942-ba14-4cb4f95cef70/dns/0.log"
Apr 24 22:16:46.318013 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:46.317996 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w7kpc_629df89e-192f-4942-ba14-4cb4f95cef70/kube-rbac-proxy/0.log"
Apr 24 22:16:46.340488 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:46.340439 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6jctc_3fe6ee90-5f50-4532-8f34-d91e4dc1fccd/dns-node-resolver/0.log"
Apr 24 22:16:46.860963 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:46.860935 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sfdpv_572a1510-69a8-48bf-94b2-311fd0c0d92f/node-ca/0.log"
Apr 24 22:16:47.092294 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:47.092264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" event={"ID":"bea18c95-773c-4a37-bdef-f17a477cf773","Type":"ContainerStarted","Data":"a9ee5caeb79b3897fd6d416b283e6333e08303839f74908c15c300087c1b6044"}
Apr 24 22:16:47.092294 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:47.092299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" event={"ID":"bea18c95-773c-4a37-bdef-f17a477cf773","Type":"ContainerStarted","Data":"a5083e7c1d5a4782dd0dc3673965194a90c3e0841d9c92ad1a3046ada2e3b4fb"}
Apr 24 22:16:47.092501 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:47.092331 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:47.111424 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:47.111341 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq" podStartSLOduration=2.111327365 podStartE2EDuration="2.111327365s" podCreationTimestamp="2026-04-24 22:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:16:47.109744302 +0000 UTC m=+3009.882281404" watchObservedRunningTime="2026-04-24 22:16:47.111327365 +0000 UTC m=+3009.883864468"
Apr 24 22:16:47.920440 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:47.920411 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2xr7h_59fef02e-c780-4d6e-a4b6-1ffe904c5a5a/serve-healthcheck-canary/0.log"
Apr 24 22:16:48.399464 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:48.399433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-p4ctn_aad14a0d-3139-41e7-b2f7-77dbdab344ac/kube-rbac-proxy/0.log"
Apr 24 22:16:48.418667 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:48.418634 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-p4ctn_aad14a0d-3139-41e7-b2f7-77dbdab344ac/exporter/0.log"
Apr 24 22:16:48.439825 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:48.439801 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-p4ctn_aad14a0d-3139-41e7-b2f7-77dbdab344ac/extractor/0.log"
Apr 24 22:16:50.470014 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:50.469984 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-fksh8_6b5780f0-47fe-431a-bc89-537580f83a52/manager/0.log"
Apr 24 22:16:53.105092 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:53.105066 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-7nrzq"
Apr 24 22:16:56.511983 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.511950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg8hv_425e4fdf-8950-4297-b9c8-488b3e610f40/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:16:56.536610 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.536590 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg8hv_425e4fdf-8950-4297-b9c8-488b3e610f40/egress-router-binary-copy/0.log"
Apr 24 22:16:56.568901 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.568882 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg8hv_425e4fdf-8950-4297-b9c8-488b3e610f40/cni-plugins/0.log"
Apr 24 22:16:56.593230 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.593211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg8hv_425e4fdf-8950-4297-b9c8-488b3e610f40/bond-cni-plugin/0.log"
Apr 24 22:16:56.620713 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.620610 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg8hv_425e4fdf-8950-4297-b9c8-488b3e610f40/routeoverride-cni/0.log"
Apr 24 22:16:56.648484 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.648460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg8hv_425e4fdf-8950-4297-b9c8-488b3e610f40/whereabouts-cni-bincopy/0.log"
Apr 24 22:16:56.674806 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.674783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hg8hv_425e4fdf-8950-4297-b9c8-488b3e610f40/whereabouts-cni/0.log"
Apr 24 22:16:56.717564 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.717540 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dqnl7_af21b504-2a52-42ab-82d6-71911cc6a655/kube-multus/0.log"
Apr 24 22:16:56.784969 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.784947 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c4ck8_52e9a5f8-832a-4f1e-add3-f10bf674757e/network-metrics-daemon/0.log"
Apr 24 22:16:56.808287 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:56.808265 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c4ck8_52e9a5f8-832a-4f1e-add3-f10bf674757e/kube-rbac-proxy/0.log"
Apr 24 22:16:57.689167 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.689089 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-controller/0.log"
Apr 24 22:16:57.704642 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.704613 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/0.log"
Apr 24 22:16:57.730179 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.730138 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovn-acl-logging/1.log"
Apr 24 22:16:57.754469 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.754431 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/kube-rbac-proxy-node/0.log"
Apr 24 22:16:57.778971 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.778938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:16:57.795827 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.795805 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/northd/0.log"
Apr 24 22:16:57.818461 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.818442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/nbdb/0.log"
Apr 24 22:16:57.838531 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:57.838507 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/sbdb/0.log"
Apr 24 22:16:58.008788 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:58.008758 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9jw7k_6da7ad0f-d1bd-4849-8159-3fc9d885aaa7/ovnkube-controller/0.log"
Apr 24 22:16:59.748518 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:16:59.748486 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mzzxk_1daccc38-8893-4df5-b7d6-357c27b4e705/network-check-target-container/0.log"
Apr 24 22:17:00.750647 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:17:00.750591 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wxlkl_967d645a-e8ea-4968-bac5-2446d61f1581/iptables-alerter/0.log"
Apr 24 22:17:01.408153 ip-10-0-133-209 kubenswrapper[2574]: I0424 22:17:01.408127 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-h4s4s_c3c7543c-7f18-44b0-b64e-519be8319862/tuned/0.log"
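A quick sketch for working with a capture like this: the repeated `"Finished parsing log file"` entries each carry a `path="…"` field, so the set of container log files the kubelet touched can be recovered with a one-line filter. This is only an illustration, and the input filename `kubelet.log` is an assumption (any saved copy of the journal output would do):

```shell
# List the unique container log paths that kubenswrapper reported parsing.
# Assumes the journal output above was saved to kubelet.log (hypothetical name).
grep -o 'path="[^"]*"' kubelet.log | sort -u
```

Each output line is one `path="/var/log/pods/…/0.log"` value, deduplicated, which makes it easy to see at a glance which pods' logs were swept during this window.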